Unit Test Presentation again
I had another Unit Test Presentation at BASD.
This time my presentation focused on the theoretical aspects of unit testing, TDD and Continuous Integration.
By Ruslan Trifonov on Friday, April 29, 2005 0 comments
While wandering through the Test-Driven Development infosphere, I found this interesting site dedicated to refactoring.
One of the most interesting parts of the site is the refactoring thumbnails section.
By Ruslan Trifonov on Friday, April 22, 2005 0 comments
A good read about agile software development techniques may be found on AgileKiwi.
By Ruslan Trifonov on Tuesday, April 19, 2005 0 comments
Required reading: the Service Locator pattern and .NET post.
What about writing unit tests for our Service Locator?
Actually, they may look quite straightforward. It is even easier, because we don't have to deal with complicated service instantiation code.
Given the sample implementation from the previous post, we may write the following code in order to test our consumer class. Some changes are needed in order to use the code in unit testing. Sample code may be obtained from here
[TestFixture]
public class ServiceConsumerTest
{
    ServiceConsumer consumer;

    [SetUp]
    public void SetUp()
    {
        consumer = new ServiceConsumer();
    }

    [TearDown]
    public void TearDown()
    {
        consumer = null;
    }

    //we expect our consumer to raise an exception if not in a container
    [Test]
    [ExpectedException(typeof(ApplicationException))]
    public void NotContainedConsumer()
    {
        consumer.Process();
    }

    [Test]
    public void ContainedConsumer()
    {
        //embed the consumer into our container
        ConfigurableContainer container = new ConfigurableContainer();
        container.Add(consumer);
        //prepare the App configuration - we may test our container with
        //various service implementations and even mocked services
        //call processing
        consumer.Process();
        //check the expected result...and assert
    }
}
In order to have this test working with VS2003, we need to add the following line under
menu Project/Properties/Build Events/Post-build Event Command Line:
copy "$(ProjectDir)App.config" "$(TargetPath).config"
This will force VS2003 to copy our configuration file into the output directory, thus making it available for NUnit and our tests.
Of course, we may mangle the configuration file in our tests in order to run them against other service implementations and even mocked services.
A better approach would be to provide an API for container configuration.
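As a sketch of that idea, the container could expose a programmatic registration API, so tests bypass the configuration file entirely. The RegistrableContainer and Register names below are hypothetical, not part of the sample code:

```csharp
using System;
using System.Collections;
using System.ComponentModel;

//Hypothetical variant of the container: tests register service
//implementations in code instead of editing App.config.
public class RegistrableContainer : Container
{
    private readonly Hashtable services = new Hashtable();

    //Register an implementation instance for a service interface.
    public void Register(Type serviceType, object implementation)
    {
        services[serviceType] = implementation;
    }

    protected override object GetService(Type service)
    {
        //a registered instance wins over the default lookup
        if (services.Contains(service))
        {
            return services[service];
        }
        return base.GetService(service);
    }
}
```

A test could then call container.Register(typeof(ICommonService), mockService) before invoking consumer.Process(), with no post-build configuration copying required.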
By Ruslan Trifonov on Thursday, April 07, 2005 0 comments
Introduction
One way to break the dependency between two classes is to use the Service Locator pattern. Recently I had a post about Dependency Injection and .NET. I pointed out that one of the ways to break the dependency between two classes is to use the built-in capabilities of the .NET Framework to create lightweight containers. Let us try to implement a simple container using the classes and interfaces from the System.ComponentModel namespace.
Implementation
Sample source code may be downloaded from here
First, we have to define our problem. Let's say we need to implement a class to process some data. This class is named ServiceConsumer. We want our ServiceConsumer to have the ability to consume data from different sources like emails and files. In order to decouple our ServiceConsumer class from the actual source data fetching, we will declare an interface called
ICommonService. We will implement two versions of ICommonService called EmailService and
FileService.
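A minimal sketch of the interface and the two implementations might look like this (the Execute bodies are illustrative placeholders; the sample download contains the actual versions):

```csharp
using System;

public interface ICommonService
{
    void Execute();
}

public class EmailService : ICommonService
{
    public void Execute()
    {
        //fetch data from an email source - illustrative placeholder
        Console.WriteLine("Fetching data from email...");
    }
}

public class FileService : ICommonService
{
    public void Execute()
    {
        //fetch data from a file source - illustrative placeholder
        Console.WriteLine("Fetching data from a file...");
    }
}
```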
The ServiceConsumer class implementation looks like this:
public class ServiceConsumer : Component
{
    public ServiceConsumer() : base()
    {
    }

    public void Process()
    {
        ICommonService service =
            (ICommonService)GetService(typeof(ICommonService));
        service.Execute();
    }
}
Now, we want to write the code to instantiate our actual service implementation. We'll do it smarter by using the configuration namespace of the .NET Framework. First, we will create our application XML configuration file:
<configuration>
<appSettings>
<add key ="ServiceLocatorSample.ICommonService"
value="ServiceLocatorSample.EmailService" />
</appSettings>
</configuration>
Following is the implementation of our container. For the sake of simplicity, our container contains the code to deal with the actual service implementation. It reads our configuration file and creates the proper service instance.
public class ConfigurableContainer : Container
{
    public ConfigurableContainer() : base()
    {
    }

    protected override object GetService(Type service)
    {
        string implementationTypeName =
            ConfigurationSettings.AppSettings[service.FullName];
        Type actualType = Type.GetType(implementationTypeName);
        if (actualType != null)
        {
            return Activator.CreateInstance(actualType);
        }
        return base.GetService(service);
    }
}
Finally, here is how the container and the consumer are wired together:
string consumerName = "consumer1";
ConfigurableContainer container = new ConfigurableContainer();
ServiceConsumer consumer = new ServiceConsumer();
container.Add(consumer, consumerName);
ServiceConsumer myConsumer =
    (ServiceConsumer)container.Components[consumerName];
myConsumer.Process();
Conclusion
What are the benefits of using lightweight containers in .NET:
- loose coupling between service consumers and service implementations
- the actual implementation may be swapped via configuration, without recompiling the consumer
- easier unit testing, since services may be replaced with mocked implementations
By Ruslan Trifonov on Wednesday, April 06, 2005 0 comments
Loose Coupling
When reading about software design, one will inevitably meet the magic words loose coupling. This rather abstract concept is often considered a metric for code quality and testability.
Besides the pure theoretical speculation, there are many significant works which shed light on some practical aspects of the loose coupling concept. One should read Inversion of Control Containers and the Dependency Injection pattern by Martin Fowler, because his work gives some great ideas about the dependency-breaking problem.
.NET and Inversion of Control
What are the ways to break a dependency in .NET in an elegant and efficient manner?
Whatever the answer, the right approach should be evaluated in a particular context. The impact on code testability (the practical aspect) should be evaluated, for example.
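One such way, sketched minimally below, is constructor-based dependency injection, where the consumer receives its collaborator from the outside instead of locating it itself. All names here are illustrative, not taken from a real project:

```csharp
using System;

public interface IDataSource
{
    string Fetch();
}

public class FileSource : IDataSource
{
    public string Fetch()
    {
        return "data from a file";
    }
}

public class Processor
{
    private readonly IDataSource source;

    //the dependency is injected through the constructor
    public Processor(IDataSource source)
    {
        this.source = source;
    }

    public void Process()
    {
        Console.WriteLine(source.Fetch());
    }
}
```

A unit test can then construct the Processor with a stub or mock IDataSource, which is exactly the testability impact mentioned above.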
By Ruslan Trifonov on Wednesday, April 06, 2005 0 comments
I had a short presentation on Unit Testing and .NET on 30.03.2005, during the monthly meeting of SofiaDev.org.
It turns out my presentation skills are not quite impressive.
Fortunately, Google has an answer again - LILSeminaras.com
They have some free presentation tips. However, I suppose real practice is the best option.
I like the idea of using my blog as a diary.
Not a detailed one, but it may one day give me a clue about my personal and professional wanderings.
By Ruslan Trifonov on Tuesday, April 05, 2005 0 comments
I was digging into the Continuous Integration theme and stumbled upon Martin Fowler's Continuous Integration article again - a classic in this field.
And some thoughts came to my mind...
I've always been an extreme fan of development process automation. I think it may be beneficial for an organization to automate most of the activities during the development process. This may be one of the ways to transform current software handcrafting into a real industry.
Why automate:
A modern approach to software development is iteration-based development. It means that your project steps through a number of iterations. In each iteration, the team adds new features and fixes defects injected in the previous ones. A key aspect of this approach is that each iteration includes a lot of similar activities. These activities are vital for the project. However, the team should spend some time managing, performing and verifying the results of these activities, and that costs money. The time and money for these activities are spent every iteration.
There is one other problem - people's ability to make mistakes. Mistakes may manifest themselves later, often when it's too late. Some methods to mitigate the risk of mistakes are training, personal and group reviews, etc. Such an approach may lower the probability of failing due to mistakes or lack of skills; however, it is pretty expensive and also tends to be a constant expense. And while training and reviews cannot be avoided, another weapon may be added to the organization's armoury - development process automation techniques.
The benefit from automation
Some ideas of what may be automated:
- building the product from source control on every check-in
- running the unit test suite as part of every build
- deploying the build to a test environment
- reporting the build and test results back to the team
By Ruslan Trifonov on Monday, April 04, 2005 0 comments
The scientific theories about complex systems like weather, biological brains, social groups and ant colonies often talk about the phenomenon of emergent characteristics. Complex systems tend to demonstrate complex structure and behavior. It is usually difficult (or even impossible) for humans to map the observed system behavior to a number of low-level intra-system interactions.
We may have good knowledge of the characteristics of the simple building components of the system and of the observed behavior of the system as a whole, but it is difficult to analyze the dynamics. The large number of interactions between simple components within the system forms the complexity of the system dynamics. The system demonstrates behavioral characteristics which are far beyond the characteristics of a single building component.
Software systems may be considered such complex systems. Although much simpler than the human brain, there are software systems which are too complex to be analyzed by a single person as a whole. The standard way to guarantee the quality of software systems today is to perform post-quality-control activities. This usually involves the QA division members, who perform a number of manual or automated tests upon the system as a whole. The effects observed are classified as expected behavior or defects. Then the defects should be analyzed and mapped to specific simpler building components in order to be fixed. It is often not trivial to say whether an observed defect is due to a static or a collaboration issue, or whether it is a design or a coding issue. It is often difficult to find the exact problematic code section. And what is always valid is that this process tends to be slow, inefficient and expensive. Organizations often do not try to analyze the reasons for defect injection. They are not trying to optimize the development process and to minimize the cost of the post-control activities. However, that is another story.
My point of view is that most post-quality-control procedures, which act upon the system as a whole, cannot on their own produce the quality that we want. A major reason is complexity and the phenomenon of emergent characteristics: the integrated system has a complexity far beyond its simpler building components. What if we added another approach to our currently used defect-prevention methods? It would be beneficial to try to decrease the defect-injection rate before system integration. It is just much easier to control the quality of the simplest system component. And we have a good candidate tool: unit testing is a good way to control the quality of the simplest system components. It is even better than that - unit tests focus the developer on the vital component-usage issues, thus helping to produce a better low-level design.
A major argument against using unit testing is the additional development overhead. After all, it costs money and time. The first answer that comes to mind is "Quality costs money". Unfortunately, this answer will not convince managers to invest a bunch of money in another quality-control methodology. However, it is not quite true that unit testing increases the production cost. Unit testing actually decreases the total cost of the production cycle by reducing the cost of refactoring and changes and by reducing the defect-injection rate, thus making the post-quality-control activities cheaper. It also makes developers feel sure about their code, thus motivating them to innovate.
By Ruslan Trifonov on Friday, April 01, 2005 0 comments