Monday, June 23, 2008


Sendmail is a mail transfer agent (MTA), a well-known project of the open source, free software and Unix communities, distributed both as free software and as proprietary software. It is also called a mail server daemon. Other mail server daemons include qmail, Postfix, Exim, MMDF and Smail. A mail server daemon receives incoming mail and delivers outgoing mail.

Sendmail configuration files
All sendmail-related configuration files reside in the /etc/mail directory.


The access database (/etc/mail/access) specifies which hosts or IP addresses may use the local mail server and what kind of access (OK, REJECT, RELAY) they have.
OK - allowed to send mail to this host as long as the mail's final destination is the local machine.
REJECT - all mail connections are rejected.
RELAY - allowed to relay mail to any destination through this server.
e.g. (the host names here are illustrative)
spammer.example.com     550 We don't expect mail from you
a.source.of.spam        REJECT
128.32                  RELAY
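After editing the access file, the database has to be rebuilt with makemap so that sendmail can read it. A minimal sketch, assuming the conventional file location (paths can vary by distribution):

```shell
# Rebuild the access database after editing /etc/mail/access
makemap hash /etc/mail/access < /etc/mail/access

# Restart sendmail so the new rules take effect
# (on many systems: service sendmail restart)
/etc/init.d/sendmail restart
```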

This database contains a list of virtual mailboxes that are expanded to other user(s), files, programs or other aliases.
root: ajantha
customers: :include: /etc/mail/lists/customer-list

The last alias shows how to keep the list of users for an alias in an external file.
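After changing the aliases file, the aliases database must be rebuilt before sendmail will use the new entries. A minimal sketch using the standard helpers:

```shell
# Rebuild the aliases database (equivalent to: sendmail -bi)
newaliases

# Verify how an alias expands without actually sending mail
sendmail -bv customers
```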

This file (/etc/mail/local-host-names) contains the list of host names for which sendmail accepts mail as the local host. If the mail server is to accept mail for additional domains, the file should list them, one host name per line.
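As an illustration with placeholder domains (example.com stands in for your real domains), the file could be created like this:

```shell
# Sketch of a local-host-names file -- one accepted host name per line
cat > /etc/mail/local-host-names <<'EOF'
example.com
mail.example.com
EOF
```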

The sendmail.cf file controls the overall behavior of sendmail. This master sendmail configuration file can be built from m4 macros in sendmail.mc.
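Regenerating sendmail.cf from the m4 master file usually looks like the following; the cf.m4 path shown is a common Red Hat location and may differ on your system:

```shell
# Build sendmail.cf from the m4 macro configuration file
m4 /usr/share/sendmail-cf/m4/cf.m4 /etc/mail/sendmail.mc > /etc/mail/sendmail.cf

# Restart sendmail to load the new configuration
/etc/init.d/sendmail restart
```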

The virtusertable maps email addresses for virtual domains and mailboxes to real mailboxes. The mailboxes can be local, remote, or previously defined aliases.
e.g. (the domain is illustrative)
ajantha@example.com     saman
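Like the access file, virtusertable is a database and must be rebuilt after editing. A sketch with placeholder virtual addresses (example.com is an assumption, not a real domain from this setup):

```shell
# Map virtual addresses to real mailboxes, then rebuild the database
cat > /etc/mail/virtusertable <<'EOF'
ajantha@example.com     saman
info@example.com        root
EOF
makemap hash /etc/mail/virtusertable < /etc/mail/virtusertable
```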

Tuesday, June 17, 2008

Containers and Servlets

A servlet container is a specialized web server that supports servlet execution. It maps request URLs to specific servlets. But how? The container implements the contract between the web server and the servlets, so the servlet itself needs no server-specific API.

The container receives a HTTP request.

The container creates request and response objects.

It allocates a new thread to handle the request, then passes the two objects to that thread.

Container calls the service() method of the servlet which would call the doGet() or doPost() methods, based on the request type.

The doGet() method builds the result, writing the generated HTML into the response object.

The HTML page is sent back to the client, the request and response objects are discarded, and the thread is released.

Hope this gives a clear picture of what happens to a request for a servlet, but there is still more. Here the business logic and presentation are tied together, which is not very good OO practice. You can separate them using JSPs, and MVC improves on it even more. We'll come back to that later.

Wednesday, June 11, 2008

ERP Systems Overview

The above figure shows an ERP model.

When we say ERP systems, we basically mean the integration of an organization's several data sources and processes into a unified system with a unified database storing the various system modules. Modular software design is the core concept of ERP systems: individual components/modules can be added to the system, and each module links into the common database so that all of the information is accessible in real time. By integrating an organization's separate systems, ERP reduces the number of isolated specialties within the organization.

When most people refer to the “core” ERP applications or “modules,” they mean the back-office capabilities to manage human resources, accounting and finance, manufacturing, and project-management functions. However, major ERP suites from Oracle, PeopleSoft, and SAP now provide much more—including modules for sales force automation, business intelligence, customer relationship management, and supply chain management. We had a practical assignment at university where we had to plan an ERP system for the Sri Lankan Railways. The document is available here. It includes study of the existing system, the proposed system features, the risks and rewards in implementing it etc. I’m sure it will provide a practical view.

Characteristics of ERP systems

• In ERP systems, information is often recorded in a form that cannot be read without the use of a computer.

• Financial and business information is often generated automatically by ERP systems based on data previously entered, without further human instructions.

• Errors that might be observed in non-ERP systems may go undetected because of the reduced human involvement in computerized processing. There is a danger that errors in processing may be applied to a large number of transactions without being noticed.

• With proper controls, ERP systems can be more reliable than non-ERP systems. This is because ERP systems subject all data to the same procedures and controls. Non-ERP systems are subject to random human error. Although computer processing will usually be consistent, errors may still occur; for example, if the computer is incorrectly programmed.

• Still, it is difficult to make changes after an ERP system has been implemented. Therefore, we should be aware of the organization’s plans to introduce significant new systems or to make major modifications to existing systems. It is advisable to review new systems or modifications before implementation so that a preliminary assessment can be made of the adequacy of control procedures, in order to ensure an adequate audit trail, and to plan any necessary changes in the audit approach.

Monday, June 9, 2008

Using SSH

SSH is a program to log into another computer over a network, to execute commands in a remote machine, and to move files from one machine to another. It provides strong authentication and secure communications over insecure channels. SSH is most useful when logging into a UNIX machine from another machine where the traditional telnet and rlogin programs would not provide password and session encryption.

Now we'll take a look at some basic SSH commands in networking.
To log in to a remote machine running the sshd server, you can use any of the following formats.

# ssh -l remote_user host_name_or_ip
# ssh remote_user@host_name_or_ip
# ssh -l root host_name_or_ip
# ssh host_name_or_ip

To execute a command on the remote machine without logging in explicitly you can use,
# ssh -l remote_user host_name_or_ip remote_command
# ssh bud@host_name_or_ip /usr/bin/X11/xclock
This starts the xclock application on the remote machine.

To ftp securely you can use sftp.
# sftp bud@host_name_or_ip

sftp> cd downloads   - change directory
sftp> mput *.rpm     - upload multiple files
sftp> mget httpd*    - download multiple files
sftp> help           - see a list of commands

To copy files from one host to another try
# scp [-r] source user@remote_host:target
# scp bud@host_name_or_ip:remote_file /tmp

There is still SSH server and client configuration to explore. And do you know how to configure passwordless SSH login?
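Passwordless login is usually set up with public-key authentication. A minimal sketch, assuming OpenSSH on both ends (user and host names are placeholders):

```shell
# 1. Generate a key pair on the local machine (empty passphrase = no prompt)
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# 2. Copy the public key into the remote host's authorized_keys
ssh-copy-id remote_user@host_name_or_ip

# 3. Subsequent logins should no longer ask for a password
ssh remote_user@host_name_or_ip
```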

AdventNet QEngine 6 - Web Performance Test Tool

Let's get some practical experience with a testing tool. AdventNet QEngine Web Performance Test tool is a powerful, easy-to-use and affordable web load testing tool to quickly test the performance of your web sites and web-based applications. Its automatic analysis enables you to accurately simulate the traffic of thousands of users, identify and isolate performance bottlenecks, and optimize the user experience within minutes.

Main Features of AdventNet QEngine 6
  • Test Scheduling and Command Line Toolkit
  • Web Performance Testing (Load Testing)
  • Web Services Testing (Functional & Performance)
  • QEngine Issue Tracker
  • QEngine Toolbar for remote record/playback of user actions
  • Browser-based Web testing
  • Jython Test Scripts
  • Environment Independent Tests
  • Powerful Script Editor
The basic steps involved in QEngine load testing are shown in the figure below.

A detailed picture of testing with QEngine is provided here.

Now we'll take a look at some of the outputs of the load testing with QEngine.

Transaction Status Summary

The above figure shows a transaction status summary graph, which provides a snapshot of requests pending, responses pending and response status (such as download started and download completed). If pending requests/responses are too high, check the test duration in the Summary Report and the error distribution graph in the Error Report to identify the problem.

Hits Per Second

This graph shows the number of HTTP/S requests made by Virtual users to the server during each second of the run.

URL wise Response Time Report

More details about QEngine testing are available at the link provided above. The report also mentions another load and stress testing tool, WAPT 5.0, which provides an easy-to-use, consistent and cost-effective way of testing web sites, web servers, and intranet applications with web interfaces.

A Little Survey on Software QA Tools

Many modern software systems consist of large sets of heterogeneously developed components. Object-oriented design, component-based software engineering, components off-the-shelf (COTS), design patterns, and open source software facilitate the development tasks, but assuring the quality in scenarios that entail (combinations of) these concepts is problematic.

Many categories of test tools are available in the industry today. Some of the most important categories are listed below.

  • Application Test Tools
  • Web Test Tools
  • Test Management Tools
  • Bug Tracking Tools
  • API Test Tools
  • Communications Test Tools
  • Requirements Management Tools

Most of these test tools are also available as open source. Such tools are free: there is no need to pay a vendor to use them, but they still carry a cost of ownership through evaluation, implementation, training and maintenance, the same as any software. Make no mistake, though: a handful of commercial vendors still dominates the proprietary software testing tools market.

The paradox of quality assurance is that, although it’s a key value for every organization, the actions taken to ensure it are often left until late in the lifecycle, when budgets are scarce, time is short and there is high pressure to deliver to the market. As a result there are often challenges associated with improving the software development process, reducing costs, improving quality and increasing the reliability of planning. So far, software QA professionals have handled these challenges successfully to safeguard the quality of software.

From my viewpoint, the future of software testing and quality assurance lies increasingly on the web. The evolution in web development over the past two to three years has ushered in a new set of challenges for software quality professionals across the board; specifically, the emergence of Web 2.0 and the introduction of Ajax and SaaS architectures as new approaches for building rich content applications for the web. Such shifts, combined with the need for enterprises to be agile and deliver products to market in shorter product cycles, have dramatically challenged many of the existing testing tools and rendered many old record-and-replay approaches almost obsolete. A detailed survey of mine on these QA tools is attached here. It also contains descriptions of various leading testing tools in the market today.

Today it is impractical to rely too much on human configuration, testing, debugging, and management, because software systems today are of very large scale and contain numerous components and users. It is believed that it is time to bring a new generation of cutting-edge technologies and innovative processes to software testing, to assist software quality professionals in building better software more easily in the near future.

Sunday, June 8, 2008

HCI's role in e-learning

Today the e-learning environment is dramatically changing the way students, employees and indeed all members of the general public learn new knowledge and perform learning activities. Effective learning occurs where students actively participate in the learning process, have ownership of what and how they learn, and are supported in appropriate ways. That is where e-learning has always been successful; still, making e-learning sustainable in a traditional education environment involves many challenges.

A major challenge currently faced by designers of e-learning systems is the development of improved tools better able to engage new learners and sustain their online learning activities any time and anywhere. E-learning systems should be designed to be usable and innovative, supporting creative learning, based on strategies which guide the learners to make the most effective use of the learning content. The approach to e-learning should be

  • learner centered
  • digitally minded
  • research based
  • focused on quality
  • innovative and
  • providing leadership.
Human-Computer Interaction (HCI) theories and methodologies can support the design of appropriate e-learning settings that respond to the requirements of today’s e-learning environment shown above. They can make e-learning applications smart enough to adapt themselves to students’ learning styles and to assure high standards of accessibility and usability, making learners’ interaction with the systems as natural and intuitive as possible. In the context of Human-Computer Interaction it is important to adopt a perspective that recognizes, respects, values and attempts to accommodate a wide range of human abilities, skills, requirements and preferences in the design of learning material. This automatically reduces the need for many special features, and it encourages individualization, high quality of interaction and, ultimately, end-user acceptability. In short, the focus here is always on the human user.

Analysis of learners’ preferred interactions with the e-learning environment, and a learner-centered design perspective which also takes into account the typical learning styles shared within different cultural contexts, are the key factors that would contribute to the successful integration of HCI in e-learning.

Datamining using RapidMiner 4.1

Data mining involves searching through databases for correlations and patterns that differ from results that would be anticipated to occur by chance or in random conditions. The practice of data mining in and of itself is neither good nor bad, and its use has become common in many industries. I participated in a data mining assignment which analyzed the export patterns of gems and jewelry in Sri Lanka. The tool used was RapidMiner 4.1. Some important facts were discovered in the process.
The process of data mining consists of three stages:
(1) the initial exploration,
(2) model building or pattern identification with validation, and
(3) deployment

Stage 1: Exploration
This stage usually starts with data preparation, which may involve cleaning data, data transformations, selecting subsets of records and - in the case of data sets with large numbers of variables ("fields") - performing some preliminary feature selection operations to bring the number of variables to a manageable range.

The figure shows a screenshot of a graph which was made on the gem exports using Rapid Miner.

Stage 2: Model building and validation

The dimensional model must suit the requirements of the users and support ease of use for direct access. The model must also be designed so that it is easy to maintain and can adapt to future changes.

The figure shows a 2D Model View of the data.

The model design must result in a relational database that supports OLAP cubes to provide instantaneous query results for analysts. A typical dimensional model uses a star or snowflake design that is easy to understand and relate to business needs, supports simplified business queries, and provides superior query performance by minimizing table joins.

Stage 3: Deployment
This final stage involves using the model selected as best in the previous stage and applying it to new data in order to generate predictions or estimates of the expected outcome.

I have attached my data mining assignment report here.