NAVmobile - the pocket-sized ERP
Optimized for Microsoft Dynamics NAV and Windows Mobile powered devices

Wednesday, December 14, 2005

Maybe I should go to school again

You Passed 8th Grade Science

Congratulations, you got 7/8 correct!

Tuesday, December 13, 2005

Compact Framework Team Top 10

The Compact Framework Team has a new post on their blog listing links to 13 of their best posts. It's worth taking some of your time to have a look.

Tuesday, December 06, 2005

Hard time presenting .NET Compact Framework

   I spent two days in Velingrad last weekend, where NASD held an event dedicated to mobile development. There were different presenters, focusing primarily on Java mobile solutions. Tihomir Ignatov, iFD Engineering GmbH, presented the .NET Compact Framework approach to mobile development. I gave a session dedicated to developing for performance with the .NET Compact Framework.
   We had quite a hard time! It turned out we were totally unprepared for this event (we weren't supplied with the right audience profile). Our presentations targeted developers with at least some full .NET Framework knowledge. It turned out that few people in the audience were familiar with these technologies. We had prepared a set of advanced CF development techniques; instead, we had to improvise to keep the audience interested.
   We had a great late-evening discussion in the local pubs that lasted till the morning :)

Thursday, December 01, 2005

Smart Device Project Conversion Failure(Part 2)

NOTE: This is my personal opinion and I do not give any guarantee. It is your responsibility if you try this

I was finally able to make my VS2005 build projects against Compact Framework 1.0.
See my previous post VS2003-To-VS2005 Solution conversion failure

Actually, the solution came to me after the precious help from Sriram.

So, in short, the problem manifested itself as follows:
1. I can't convert a VS2003 Smart Device project
2. I can't create new projects targeting Compact Framework 1.0 from VS2005

Note that I have VS2003 installed on the same machine and have no problem targeting Compact Framework 1.0 from it.
I reinstalled Compact Framework 1.0 SP3, however the problem was still there.
The partial solution in my previous post works only if you target Compact Framework 2.0.

So here comes the solution:
I opened my registry (HKLM\SOFTWARE\Microsoft\.NETCompactFramework) and noticed that I did not have a key named v1.0.5000.0. This was actually spotted by Sriram.
I exported this key from another machine, where the conversion to VS2005 was not broken, and imported it into my registry. However, I corrected the path strings to point to my VS2005 installation.

Then I tried to convert the VS2003 Smart Device project to VS2005 and succeeded!

Then came the second problem. I opened the converted project, however the form designer looked terrible and I was not able to compile it.
The solution:
I checked under C:\Program Files\Microsoft Visual Studio 8\SmartDevices\SDK\CompactFramework\2.0 and noticed there was only a folder named v2.0.
I found that I have a folder D:\Program Files\Microsoft Visual Studio 8\SmartDevices\SDK\CompactFramework\1.0 - note it is on drive D:.
I copied its content into C:\Program Files\Microsoft Visual Studio 8\SmartDevices\SDK\CompactFramework\2.0\v1.0.

And oops, my VS2005 is back to life - I am able to compile CF 1.0 projects.

Unfortunately, I can't tell the exact steps which caused this problem. The only thing I know for sure is that it is somehow related to VS2005 Beta 2 and my attempt to clean it off before installing the final one.
It may also be related to my few attempts to re-install VS2005.
There is also another problem now - the Toolbox does not contain any controls - and I am looking forward to solving this too.
The easiest way would probably be to start with a clean Windows installation, however I do not have the time to do it.

Wednesday, November 30, 2005

The first portable computer

While reading around, I found proof of how fast technology moves forward. See a picture of the first "portable" computer, called the Osborne 1, created in 1981. It was sized to fit under an airplane seat and was armed with BASIC, WordStar and SuperCalc. So it seems you had to buy an airplane to carry it. Now we buy bags and belts and pants with pockets. :)

Monday, November 21, 2005

SKU Matrix for Windows Mobile Version 5.0

Are you looking for a brief Windows Mobile 5 feature list?
This is the place where you may start reading.

Sunday, November 20, 2005

.NET Compact Framework 2.0 Performance and Reflection

   If you spend your time researching performance improvement tips, you will soon find that reflection is a great but expensive feature in terms of performance.
Let's see some metrics, then!
   My test environment is just the same as in my previous post:
We've got an HTC QTEK 9090, VS2005, CF 2.0, SqlCe 3.0.
I used the same database structure and 10000 rows to fetch. Every scenario makes 5 subsequent runs. The application is restarted before each test scenario run. Scenario 1 is not listed here, because it uses DataRow instances to represent database table rows.
The code is almost the same. I changed only the way the BObject instance is created. This time it is created by using the Activator.CreateInstance method.

List<BObject> boList = new List<BObject>();
...
IDataReader reader = cmd.ExecuteReader();
...
while(reader.Read())
{
//This was the old instance creation
//BObject bo = new BObject();


//This is the new extra time-consuming instance creation
BObject bo = (BObject)Activator.CreateInstance(typeof(BObject));
bo.id = reader.GetInt32(0);
bo.data1 = reader.GetString(1);
bo.data2 = reader.GetDateTime(2);
bo.data3 = reader.GetDouble(3);
boList.Add(bo);
}


I repeated the same testing scenarios and here are the results:
Scenario 2: SqlCeDataReader
#    Time [ms]    Memory [bytes]    Time increase [%]
1    10064        962216            99
2    8344         1125212           142
3    8261         1295408           136
4    8288         1422712           143
5    8333         952280            129

Scenario 3-a: SqlCeResultSet, None
#    Time [ms]    Memory [bytes]    Time increase [%]
1    10576        962064            97
2    9091         1125200           133
3    9029         1295396           131
4    9062         1422700           129
5    9019         952128            133

Scenario 3-b: SqlCeResultSet, Insensitive
#    Time [ms]    Memory [bytes]    Time increase [%]
1    15486        962064            27
2    14016        1125200           29
3    13886        1295396           28
4    14118        1422700           29
5    14105        952128            30

Scenario 3-c: SqlCeResultSet, Scrollable, Updatable, Sensitive
#    Time [ms]    Memory [bytes]    Time increase [%]
1    14054        962064            42
2    12494        1125200           70
3    12369        1295396           64
4    12471        1422700           65
5    12406        952128            67


   We've got a ~25-140 percent time increase due to using this resource-hungry feature! There is also a small memory overhead. It is not as significant as the time increase; however, the memory increase was stable across all the test scenario runs that I performed. It is interesting that test scenarios 2 and 3-a (the faster data access methods) report a far greater time increase compared with the rest. The time difference between scenarios 3-a and 3-b in the previous test cases had a mean value of ~60%. In this case the difference is only ~30%, due to the performance loss in scenario 3-a.
   One of the reasons which motivated me to perform these tests is that the Activator.CreateInstance pattern is very useful when extensibility and flexibility are required.
And while this pattern is frequently used in the desktop environment, it comes at a great cost in the CF world.
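
If extensibility is still required, one way to limit the hit (not measured here; the type and delegate names below are illustrative) is to pay the reflective lookup once and reuse a plain delegate for the per-row creation. CF 2.0 supports anonymous methods and generic collections, so a sketch could look like this:

using System;
using System.Collections.Generic;

//Hypothetical factory registry: reflection is paid once at registration time,
//row-by-row creation is a cheap delegate call afterwards.
public delegate object ObjectCreator();

public class CreatorRegistry
{
    private static readonly Dictionary<Type, ObjectCreator> creators =
        new Dictionary<Type, ObjectCreator>();

    public static void Register(Type type, ObjectCreator creator)
    {
        creators[type] = creator;
    }

    public static object Create(Type type)
    {
        ObjectCreator creator;
        if (creators.TryGetValue(type, out creator))
        {
            return creator();                   //no reflection on this path
        }
        return Activator.CreateInstance(type);  //reflective fallback for unknown types
    }
}

Registration happens once, e.g. CreatorRegistry.Register(typeof(BObject), delegate { return new BObject(); });, and the fetch loop then calls CreatorRegistry.Create(typeof(BObject)) instead of Activator.CreateInstance.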

Saturday, November 12, 2005

.NET Compact Framework 2.0 Data Access Performance

Further reading: .NET Compact Framework 2.0 Performance and Reflection

I took a little of my time today to prepare a VS2005 Smart Device project to test the performance differences between some of the SqlCe data access options. I have the following test environment:
Device: HTC QTEK 9090
OS: Windows Mobile 2003 SE
Dev. Env: VS2005, CF 2.0, SqlCe 3.0
I've created a table with the following Sql:

CREATE TABLE test1
(
id int IDENTITY NOT NULL,
data1 nvarchar(100),
data2 datetime,
data3 float
)


I've populated it with 10000 rows, using the following Sql:
insert into test1(data1,data2,data3)values('some string',getdate(),123)
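
For completeness, the rows could have been inserted with a small loop like the following (a sketch, not the original code; the connection string is illustrative):

using System.Data.SqlServerCe;

public class TestDataLoader
{
    //Inserts the 10000 test rows by re-executing the insert statement above.
    public static void Populate(string connectionString)
    {
        using (SqlCeConnection connection = new SqlCeConnection(connectionString))
        {
            connection.Open();
            using (SqlCeCommand cmd = connection.CreateCommand())
            {
                cmd.CommandText =
                    "insert into test1(data1,data2,data3)values('some string',getdate(),123)";
                for (int i = 0; i < 10000; i++)
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
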
I then measured the performance while reading rows from SqlCe 3.0, using the following test scenarios:

scenario 1: DataSet, SqlCeAdapter
scenario 2: SqlCeDataReader
scenario 3-a: SqlCeResultSet, options None
scenario 3-b: SqlCeResultSet, options Insensitive
scenario 3-c: SqlCeResultSet, options Sensitive,Updatable, Scrollable

All scenarios were run using the following Sql statement:
SELECT id,data1,data2,data3 FROM test1

There is a significant difference between the first scenario and the rest. Scenario 1 fetches all rows into a DataSet. The others fetch
rows into object instances defined like this:
public class BObject
{
public int id;
public string data1;
public DateTime data2;
public double data3;
}

The fetching code was like:
List<BObject> boList = new List<BObject>();
...
IDataReader reader = cmd.ExecuteReader();
...
while(reader.Read())
{
BObject bo = new BObject();
bo.id = reader.GetInt32(0);
bo.data1 = reader.GetString(1);
bo.data2 = reader.GetDateTime(2);
bo.data3 = reader.GetDouble(3);
boList.Add(bo);
}
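
For reference, the remaining scenarios were along these lines (a sketch reconstructed from the scenario names, not the exact test code; connection is assumed to be an open SqlCeConnection, and the types live in System.Data and System.Data.SqlServerCe):

//Scenario 1 (sketch): fill a DataSet through a SqlCeDataAdapter.
SqlCeDataAdapter adapter = new SqlCeDataAdapter(
    "SELECT id,data1,data2,data3 FROM test1", connection);
DataSet ds = new DataSet();
adapter.Fill(ds);

//Scenarios 3-a/3-b/3-c (sketch): request a SqlCeResultSet with the given options.
//SqlCeResultSet derives from SqlCeDataReader, so the reading loop above stays the same.
SqlCeCommand rsCmd = connection.CreateCommand();
rsCmd.CommandText = "SELECT id,data1,data2,data3 FROM test1";
SqlCeResultSet resultSet = rsCmd.ExecuteResultSet(ResultSetOptions.None);        //3-a
//...or ResultSetOptions.Insensitive                                             //3-b
//...or ResultSetOptions.Scrollable | ResultSetOptions.Updatable |
//      ResultSetOptions.Sensitive                                               //3-c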

Every scenario was executed 5 times, and the execution time and the total memory reported by the GC were written down. The application was restarted before each scenario test run. The results are as follows:
Scenario 1: DataSet
#    Time [ms]    Memory [bytes]
1    10112        2096016
2    8673         1930176
3    8592         2189796
4    8656         2096160
5    8654         1930176

Scenario 2: SqlCeDataReader
#    Time [ms]    Memory [bytes]
1    5069         952548
2    3453         1051124
3    3494         1151748
4    3414         850952
5    3639         953004

Scenario 3-a: SqlCeResultSet, None
#    Time [ms]    Memory [bytes]
1    5368         952536
2    3904         1051156
3    3902         1151692
4    3964         850896
5    3875         952992

Scenario 3-b: SqlCeResultSet, Insensitive
#    Time [ms]    Memory [bytes]
1    12238        952536
2    10848        1051156
3    10826        1151692
4    10917        850896
5    10877        952992

Scenario 3-c: SqlCeResultSet, Scrollable, Updatable, Sensitive
#    Time [ms]    Memory [bytes]
1    9928         952536
2    8266         1051156
3    8444         1151692
4    8538         850896
5    8460         952992

What is obvious at first sight is that SqlCeResultSet has different performance characteristics depending on the options used. You may also see that the DataSet comes first in memory consumed. One interesting figure is the longer time needed to execute test run 1 - that's because of the JIT compilation on the first code hit. It seems from these tests that using SqlCeDataReader should be preferred in strict fetching scenarios, although the performance differences from SqlCeResultSet (None) are not that meaningful. However, if one needs scrolling and updating, SqlCeResultSet is the best choice. Unfortunately, I have not written down the performance status of system components like SqlCe memory, total system memory used, etc. That would help in getting a better picture of the data access pros and cons.

Wednesday, November 09, 2005

VS2003-To-VS2005 Solution conversion failure

I tried to convert my VS2003 Smart Device solution to VS2005. Unfortunately, the conversion wizard failed with the message "Compact Framework 1.0 is not installed." I was in a hurry, so I decided just to work around the problem. (I have CF 1.0 installed, actually! I was even able to create and run a CF 1.0 solution with VS2003 in order to see if it still works.) Anyway, I found the following resolution:

After the conversion failure you will see your VS2005 solution with all projects marked as "Unavailable".
In order to fix the problem, perform the following steps:

  1. Right-click the project in Solution Explorer and select "Edit xxxxxx.csproj" - this opens the project file source.
  2. Locate the TargetFrameworkVersion tag and change its content from "v1.0" to "v2.0".
  3. Right-click the project in Solution Explorer again and select "Reload project." The conversion wizard will open again and you will have your project in VS2005 format.
    This does not mean you will be able to compile it. You may have to spend some more time solving some other issues :(


Compact Framework Performance Hints

Performance is a big pain in the CF world. Here come some
useful general performance hints for CF 1.0/2.0.

1. Avoid virtual calls when not needed.
Virtual calls are slower than instance and static calls. In CF 1.0 virtual calls are interpreted at runtime due to the working set cost (vtables are not used). This makes virtual calls about 40% slower than static and instance calls. Interpreted virtual calls are cached in a fixed-length cache. CF 2.0 uses a variable-size cache, which leads to a nearly 100% hit rate. Anyway, avoid using virtual calls if they are not needed.

2. Avoid properties
Properties are slower than fields. The compiler may happen to inline property getters and setters, but do not trust it to. CF 2.0 offers better performance on virtual properties due to the above-mentioned changes.

3. Avoid PInvoke.
It is up to ~6 times more expensive than managed instance calls.

4. Override Equals() and GetHashCode()
The built-in behavior of Equals() and GetHashCode() uses reflection, which adds an extra performance cost. Create your own optimized implementations.
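
For example, a hand-written pair for a hypothetical key struct (the reflection-based ValueType defaults are the expensive ones):

public struct RecordKey
{
    public int Id;
    public int Version;

    public override bool Equals(object obj)
    {
        //avoids the reflection-based ValueType.Equals
        if (!(obj is RecordKey)) return false;
        RecordKey other = (RecordKey)obj;
        return Id == other.Id && Version == other.Version;
    }

    public override int GetHashCode()
    {
        //cheap, allocation-free hash
        return Id ^ (Version << 16);
    }
}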

5. Explicit string interning.
It may help optimize performance; however, do not use it (and trust the built-in interning mechanism) unless you are sure that it helps. The interning process may bring extra performance penalties if not used properly.

6. Method inlining.
Although it is not easy to predict whether the JIT compiler will inline a particular method, it may be beneficial to try to make the method a candidate for inlining.
The method must have:
- 16 bytes of IL or less
- No branching
- No local vars
- No exception handlers
- No 32-bit floating point arguments or return values
- Arguments used in their declaration order
And do not use the debugger :)

7. Enregistration
CF 2.0 introduces enregistration - the process of storing 32-bit-sized locals and arguments in CPU registers. This leads to a significant performance improvement; however, you should stick with 32-bit-sized variables. Smaller variables may not be as efficient as 32-bit ones due to additional conversion overhead.

8. Garbage Collection
In general the GC is an expensive process from a performance point of view.
Try to avoid creating too many managed objects - for example, avoid boxing/unboxing operations, poorly written string manipulations, etc. In order to diagnose GC issues, look up the "GC Latency Time" performance counter in mscoree.stat. CF 2.0 allows runtime performance log output, which may help build better diagnostic/monitoring solutions.

9. Use ThreadPool
If you have a big number of short-living async workers, using the ThreadPool will help your application perform better.
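
For example (a sketch; DoWork stands for any short-lived piece of background work):

//queue the work item instead of creating a dedicated Thread for it
System.Threading.ThreadPool.QueueUserWorkItem(new System.Threading.WaitCallback(DoWork));

static void DoWork(object state)
{
    //short-lived async work goes here
}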

10. Parsing DateTime
Use the DateTime.ParseExact method to parse a DateTime. Otherwise a variety of culture-specific conversions will be applied until the right one is found.
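
For example (the value and format string below are illustrative):

//the exact format plus the invariant culture skip the culture-specific
//probing that DateTime.Parse may have to do
DateTime value = DateTime.ParseExact(
    "2005-11-20 13:45:00",
    "yyyy-MM-dd HH:mm:ss",
    System.Globalization.CultureInfo.InvariantCulture);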

11. Some common hints
- Avoid iterators - use indexers
- Pre-size collections
- Avoid boxing/unboxing
- Use specialized classes for string operations
- Use typed collections for best performance. Extensive generics usage may increase JIT pressure, which may cause performance penalties depending on the number of type/generic combinations defined.

12. Xml
- Use XmlReader and XmlWriter for Xml processing
- Don't use schema parsing unless you must
- Design shorter documents: strip white space, use attributes
- Use Skip(), where possible - it is faster than Read()
- Use the XmlReader/XmlWriter factory classes to create a properly optimized reader/writer
- Use the XmlReaderSettings and XmlWriterSettings classes to get better optimized readers/writers
- Avoid Windows codepage encodings - use UTF-8 or ASCII
- Create a single XmlSerializer per type and cache it for later use; XmlSerializer creation is expensive
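
A sketch of such a cache (class and member names are illustrative; add locking if serializers are requested from multiple threads):

using System;
using System.Collections;
using System.Xml.Serialization;

public class SerializerCache
{
    private static readonly Hashtable cache = new Hashtable();

    public static XmlSerializer GetSerializer(Type type)
    {
        XmlSerializer serializer = (XmlSerializer)cache[type];
        if (serializer == null)
        {
            serializer = new XmlSerializer(type);  //the expensive call - done once per type
            cache[type] = serializer;
        }
        return serializer;
    }
}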

13. DataSets
- Use typed DataSets
- Avoid DateTime columns - use ticks instead
- Avoid storing DataSets as Xml - store the schema as well if it cannot be avoided
- Use a schema when loading a DataSet from Xml
- Map columns as attributes when loading a DataSet from Xml

14. Data
- Avoid DataSets
- Use DataReader and Sql Server Ce
- Dispose SqlCeCommand and DataReaders

15. Web Services
- Create a single web service proxy instance and use it during the whole application life cycle in order to avoid the first-hit proxy reflection (see the sketch below)
- Avoid sending DataSets across the network
- Use DiffGrams if DataSet usage is required
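
A sketch of the single-proxy idea (MyService is a placeholder for whatever proxy class "Add Web Reference" generated):

public class ServiceProxyHolder
{
    //MyService stands in for the generated web service proxy type
    private static MyService proxy;

    //lazily created once; the first-hit proxy reflection cost is paid a single time
    public static MyService Proxy
    {
        get
        {
            if (proxy == null)
            {
                proxy = new MyService();
            }
            return proxy;
        }
    }
}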

17. Avoid reflection
Although reflection may give applications flexibility, it is quite a resource-hungry feature.
Avoid using it where possible.

18. GUI
- Load and cache Forms in the background
- Do not populate data in Form.Show(); do it asynchronously
- Use SuspendLayout/ResumeLayout when repositioning controls (see the sketch below)
- Be careful when using events
- Use background processing
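
A sketch of the SuspendLayout/ResumeLayout hint (assuming a container control named panel):

panel.SuspendLayout();
try
{
    //reposition a batch of controls with a single layout pass at the end
    foreach (System.Windows.Forms.Control child in panel.Controls)
    {
        child.Left += 10;   //illustrative repositioning
    }
}
finally
{
    panel.ResumeLayout();
}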

Details may be found on .NET Compact Framework Team's Blog

Tuesday, November 08, 2005

Sofia .NET User Group Meeting

The next Sofia .NET User Group meeting is going to be on 12 Nov 2005 at 11:00 at FMI (Sofia University). The event is organized together with BARS.
This time it will be a whole-day event dedicated to the VS2005 launch.
We will have 3 great presenters and they are going to talk about:

No doubt it will be a great event!

Tuesday, November 01, 2005

NavMobile - pocket size ERP

We are going to present our new mobile product at BAIT Expo 2005.
The product is the result of the joint efforts of RITSoftware and Intelligent Systems.
It is called NavMobile; it acts as a mobile ERP front-end and targets multiple ERP systems.

The platform is built on the Compact Framework on the mobile side and targets Pocket PC devices. It provides instrumentation for rapid on-site customization, smart synchronization, remote monitoring and configuration, and more.

Everybody is welcome to meet us between 1 and 6 Nov 2005 at Inter Expo Center - Sofia, hall 3, 3B10.

Stay tuned for more details ...

Monday, October 17, 2005

Bring Reporting Services back to life

A few weeks ago our test Reporting Services installation broke after extensive use of "DROP" statements over and over again :). It is a test server, after all.
So, the Reporting Services site started returning the following error:
An unexpected error occurred in Report Processing. (rsInternalError) Get Online Help Database 'ReportServerTempDB'
cannot be opened because it is offline.


Fortunately, there was good news: I still had our reports intact. However, I couldn't use them. After an hour or so the solution came to my mind:

I performed the following steps:
1. First I created an empty ReportServerTempDB database using MSSQL Enterprise Manager.
2. I created a role named RSExecRole in the ReportServerTempDB database and added the RS user to this role. Luckily, this was also my current user.
3. I opened the Reporting Services installation folder (C:\Program Files\Microsoft SQL Server\MSSQL\Reporting Services\ReportServer)...
4. ...and executed CatalogTempDB.sql against ReportServerTempDB.

...and our Reporting Services installation is back online :)

Sunday, October 16, 2005

Compact Framework Presentation

I gave another presentation during the monthly meeting of the Sofia .NET User Group on 13 Oct 2005.
The presentation was actually delivered by me and Tihomir Ignatov, iFD Engineering GmbH.
Once again I presented too much theory and too little real coding action - I think this approach bored our audience.
I hope to have the opportunity to change this next time...

Friday, October 14, 2005

Reading news with Outlook Express..

I found some strange behavior (it may even be a feature!) of MS Outlook Express while trying to read some newsgroups. Actually, I have been observing this strange effect for the last few weeks. The effect was reproduced on each of the machines that I usually work on. This was true even after a Windows re-installation.

When I click on the newsgroup that I want to read (it is usually displayed in the treeview on the left),
the right pane displays the list of postings. OE starts retrieving new posts, and then suddenly the post list disappears and the list of subscribed newsgroups appears.
This actually prevents me from reading news. Going into offline mode resolves this issue partially.

What a relief I felt when I found the cause.
Actually, a few weeks ago I started using a new way to launch OE:
   1. Start Menu/Run
   2. type "news://"
and oops, we have a problem!

If I start OE by using a shortcut (or by pointing to "msimn.exe" from Start/Run/Browse), this effect is not observed.


Monday, October 10, 2005

Compact Framework 2.0 goodies

Following is a list of the new features in Compact Framework 2.0:

  • COM Interoperability (RCW)

  • Anonymous methods

  • Generics

  • Partial classes

  • Xml (performance improvements, XmlSerializer, XPath, Schema)

  • NTLM, Negotiate and Kerberos authentication protocols

  • IPv6 support

  • SOAP 1.2 support

  • System.Messaging namespace

  • RegistryKey class

  • Serial port support

  • Improved cryptographic support

  • Improved threading

  • System.Drawing improvements

  • Additional WinForms Features and Controls

  • Design time form inheritance

  • Design time custom controls

  • Direct3D and DirectDraw Mobile

  • Pocket Outlook Managed Library

  • Telephony Managed Library

  • Location Services support

  • Runtime hosting

  • Enhanced Type Marshaling

  • Notification Broker

  • Simplified Asynchronous Web Services programming model
    How compact is CF.NET

    See these interesting figures

    Palm and Microsoft announced a partnership

    Palm and Microsoft announced a partnership, so we are waiting to see the first Palm mobile device powered by Windows Mobile 5.
    More details...

    Why Can't I Upgrade

    Windows Mobile 5 is out there. A lot of people want to upgrade, but they can't. Why?
    Here comes the answer

    Thursday, October 06, 2005

    Sunday, September 04, 2005

    Security Guidance for .NET Framework 2.0

    A Security Guidance for .NET Framework 2.0 page contains resources about building secure applications with .NET 2.0

    Thursday, August 11, 2005

    Eric's Advice For First-Time Technical Presenters

    Some useful tips for first-time presenters may be found here

    Monday, July 18, 2005

    C# Code Generation with .NET - part 2

       This post wraps up the classes contained in the System.CodeDom namespace and the code snippets from my previous post into a simple architecture for generating C# classes from Xml content.
    Let's have the following class diagram:



    Class XmlCodeGenerator provides a public method Generate.
    It has the following declaration:



    public class XmlCodeGenerator:IXmlCodeGenerator
    {
       public IList Generate(XmlDocument xmlToMap, string targetFolder,string _namespace)
       {
    ....


    xmlToMap contains the Xml content which should be processed.
    targetFolder is the folder where the source files will be generated.
    _namespace is the root namespace for all new classes.
    The method returns a list of the class names generated from the Xml file.

    The core of the code generation is implemented in an XmlNodeToClassMapper descendant class.
    XmlNodeToClassMapper is an abstract class and contains one abstract method:



    public abstract class XmlNodeToClassMapper
    {
    public abstract System.CodeDom.CodeObject Map(XmlNode nodeToMap,params object[] parameters);
    }


       The XmlCodeGenerator.Generate method iterates over the Xml elements and invokes the Map method of a ClassMapper instance for every Xml element which has child Xml elements or Xml attributes. Xml elements without child elements or Xml attributes are mapped to members of type System.String.
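
    A much-simplified sketch of that iteration (the real implementation is in the downloadable solution below; the ClassMapper usage and the file writing are illustrative and partly elided):

    public IList Generate(XmlDocument xmlToMap, string targetFolder, string _namespace)
    {
        ArrayList generatedClassNames = new ArrayList();
        XmlNodeToClassMapper mapper = new ClassMapper();

        foreach (XmlNode node in xmlToMap.SelectNodes("//*"))
        {
            bool hasChildElements = node.SelectNodes("*").Count > 0;
            bool hasAttributes = node.Attributes != null && node.Attributes.Count > 0;

            //only elements with child elements or attributes become classes;
            //the rest end up as System.String members of their parent class
            if (hasChildElements || hasAttributes)
            {
                CodeTypeDeclaration declaration =
                    (CodeTypeDeclaration)mapper.Map(node, _namespace, targetFolder);
                //...write the declaration to a .cs file under targetFolder...
                generatedClassNames.Add(declaration.Name);
            }
        }
        return generatedClassNames;
    }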

       The VS2003 solution for this sample may be downloaded here.
    It contains the XmlCodeGenerator implementation, Unit Tests and sample Windows Application.

       Note that this sample is not meant to be a complete code generation solution. It is intended to illustrate
    some of the .NET features, which may be used in code generation scenarios...

    Sunday, July 17, 2005

    C# Code Generation with .NET - part 1

    Code generation tools are already standard equipment for developers these days.
    The main motivation to use code generation techniques is that most projects share similar development tasks. It is possible to automate the execution of these tasks by using special tools - code generators.
    The sample proposed here demonstrates some .NET features for language-independent code generation. The sample itself should not be considered a complete code generator architecture. The development scenario automated in this sample is when the developer has a sample Xml file which should be deserialized into a native .NET class, processed somehow and serialized back to Xml content.
    The sample implements a class called XmlCodeGenerator, which generates C# source files from a given Xml file. Each generated source file contains a C# class for each unique Xml element contained in the Xml file. It automatically recognizes whether an Xml element should be mapped to a class or a class member. Due to the missing information about the actual data types, all Xml elements mapped to simple class members are implemented as System.String. The generated classes may be compiled and used with the System.Xml.Serialization.XmlSerializer class.

    If we have the following xml file:


    <?xml version="1.0" encoding="utf-8" ?>
    <employeeList>
    <employee memberOf="Managers">
    <names>Peter</names>
    <birthDate>1.1.1967</birthDate>
    <contact>
    <town>Sofia</town>
    <address>1000 Mladost</address>
    <phone>0887435234</phone>
    </contact>
    </employee>
    </employeeList>


    XmlCodeGenerator will map the xml content to 3 classes in the
    following way:

    Xml element     .NET class
    employeeList    EmployeeList
    employee        Employee
    contact         Contact


    The rest of the elements and attributes within the Xml content will be mapped as class members.

    This core functionality is wrapped around the System.CodeDom namespace. In general, it contains classes that can be used to represent the elements and structure of a source code document.
    In order to create a .NET class declaration for the root Xml element "employeeList", we will have the following code:



    using Microsoft.CSharp;
    using System.CodeDom;
    using System.CodeDom.Compiler;

    CodeTypeDeclaration classDeclaration = new CodeTypeDeclaration("EmployeeList");
    classDeclaration.IsClass = true;
    //add a constructor and make it public
    CodeConstructor constructor = new CodeConstructor();
    constructor.Attributes = MemberAttributes.Public;
    classDeclaration.Members.Add(constructor);
    //add class level XmlRoot attribute
    classDeclaration.CustomAttributes.Add(new CodeAttributeDeclaration("System.Xml.Serialization.XmlRoot",new CodeAttributeArgument(new CodePrimitiveExpression("employeeList"))));



    Next, we should add some class members. In our case the "employeeList" element has one sub-element named "employee". It will appear as a class member called Employee of type Employee. The following code may be used to create this class member:



    using Microsoft.CSharp;
    using System.CodeDom;
    using System.CodeDom.Compiler;

    //create field
    CodeMemberField field = new CodeMemberField("Employee","m_employee");
    //embed field into the class declaration
    classDeclaration.Members.Add(field);
    //create property
    CodeMemberProperty property = new CodeMemberProperty();
    property.Name = "Employee";
    property.Type = new CodeTypeReference("Employee");
    property.Attributes = MemberAttributes.Public;
    //add XmlElement attribute for serialization support
    property.CustomAttributes.Add(new CodeAttributeDeclaration("System.Xml.Serialization.XmlElement",new CodeAttributeArgument(new CodePrimitiveExpression("employee"))));
    property.HasGet = true;
    property.HasSet = true;
    property.GetStatements.Add(new CodeMethodReturnStatement(new CodeFieldReferenceExpression(new CodeThisReferenceExpression(),field.Name)));
    property.SetStatements.Add(new CodeAssignStatement(new CodeFieldReferenceExpression(new CodeThisReferenceExpression(),field.Name), new CodePropertySetValueReferenceExpression()));
    //embed property into the class declaration
    classDeclaration.Members.Add(property);




    Having this class declaration we have to generate the C# code.
    We may do it by using the following code:


    using Microsoft.CSharp;
    using System.CodeDom;
    using System.CodeDom.Compiler;

    //obtain C# code provider
    CSharpCodeProvider provider = new CSharpCodeProvider();
    //get the proper code generator
    ICodeGenerator generator = provider.CreateGenerator();
    //create a namespace to place our class declaration into (the namespace name is arbitrary here)
    CodeNamespace _namespace = new CodeNamespace("EmployeeListSample");
    _namespace.Types.Add(classDeclaration);
    //generate code
    CodeGeneratorOptions options = new CodeGeneratorOptions();
    options.BlankLinesBetweenMembers = true;
    System.IO.StreamWriter sw = System.IO.File.CreateText("D:\\work\\employeeList.cs");
    generator.GenerateCodeFromNamespace(_namespace,sw,options);
    sw.Close();



    As you may see, having this kind of class declaration, our code generator is not limited to C# code generation only.

    Let's wrap up these code snippets... To be continued

    Monday, July 11, 2005

    RSS in Longhorn

    More info about the announced support for RSS in Longhorn may be found here

    The Longhorn RSS platform consists of three parts:

  • RSS Feed List
    Provide access to the feeds to which the user is subscribed
  • RSS Data Store
    Provide access through an object model to the downloaded feeds and enclosures
  • RSS Platform Sync Engine
    Service, which automatically downloads feed content and enclosures

    Hopefully, the platform architecture is designed to be easily accessible from RSS-enabled applications.

    Some demos may be found here

    Tuesday, July 05, 2005

    The Open Source Heretic

    Which one is better: the open-source or the traditional software business model?

    Some interesting thoughts may arise if you read The Open Source Heretic
    article by Larry McVoy

    Design Patterns used in .NET

    I found another interesting article via the Moth

    Design Patterns used in the framework

    Friday, July 01, 2005

    CF vs Full .NET

    Another good document about the differences between the CF and the full .NET Framework may be found here

    The following article gives some good advice on Writing High-Performance Managed Applications.
    These hints may be used in the CF as well.

    Compact Framework Architecture

    I found an interesting document discussing the internal differences between the full .NET Framework and the Compact Framework.

    It may be found here - Compact Framework Architecture

    Friday, April 29, 2005

    Unit Test Presentation again

    I gave another Unit Test presentation at BASD.
    This time my presentation focused on the theoretical aspects of Unit Testing, TDD and Continuous Integration.

    Friday, April 22, 2005

    Refactoring thumbnails

    While swimming in the Test Driven Development infosphere, I found this interesting site dedicated to Refactoring.
    One of the most interesting parts of the site is the refactoring thumbnails section.

    Tuesday, April 19, 2005

    Agile software development

    Some good reading about agile software development techniques may be found on AgileKiwi

    Thursday, April 07, 2005

    Service Locator Unit Test

    Required reading: the Service Locator pattern and .NET post.

    What about writing unit tests for our Service Locator?
    Actually, they may look quite straightforward. It is even easier because we don't have to deal with complicated service instantiation code.

    Given the sample implementation from the previous post, we may write the following code in order to test our consumer class. Actually, some changes are needed in order to use the code in unit testing. The sample code may be obtained from here

    [TestFixture]
    public class ServiceConsumerTest
    {
    ServiceConsumer consumer;
    [SetUp]
    public void SetUp()
    {
    consumer = new ServiceConsumer();
    }
    [TearDown]
    public void TearDown()
    {
    consumer=null;
    }


    [Test]
    [ExpectedException(typeof(ApplicationException))]
    //we expect our consumer to raise exception if not in container
    public void NotContainedConsumer()
    {
    consumer.Process();
    }

    [Test]
    public void ContainedConsumer()
    {
    //embed the consumer into our container
    ConfigurableContainer container = new ConfigurableContainer();
    container.Add(consumer);

    //prepare the App configuration - we may test our container with various service
    //implementations and even mocked services

    //call processing

    consumer.Process();

    //check expected result....and assert
    }
    }

    In order to have this test working with VS2003, we need to add the following line in
    menu Project/Properties/Build Events/Post Build Event Command Line

    copy "$(ProjectDir)App.config" "$(TargetPath).config"


    This will force VS2003 to copy our configuration file into the output directory, thus making it available for NUnit and our tests.

    Of course, we may mangle the configuration file in our tests in order to perform tests with other service implementations and even mocked services.
    A better approach could be to provide an API for container configuration.

    Wednesday, April 06, 2005

    Service Locator pattern and .NET

    Introduction
    One way to break the dependency between two classes is to use the Service Locator pattern. Recently I had a post about Dependency Injection and .NET. I pointed out that one of the ways to break the dependency between two classes is to use the built-in capabilities of the .NET Framework to create lightweight containers. Let us try to implement a simple container using the classes and interfaces from the System.ComponentModel namespace.

    Implementation
    Sample source code may be downloaded from here
    First, we have to define our problem. Let's say we need to implement a class to process some data. This class is named ServiceConsumer. We want our ServiceConsumer to have the ability to consume data from different sources like emails and files. In order to decouple our ServiceConsumer class from the actual source data fetching, we will declare an interface called ICommonService. We will implement two versions of ICommonService, called EmailService and FileService.


    The ServiceConsumer class implementation looks like this:

    1  public class ServiceConsumer:Component
    2 {
    3 public ServiceConsumer():base()
    4 {
    5 }
    6 public void Process()
    7 {
    8 ICommonService service=
    9 (ICommonService)GetService(typeof(ICommonService));
    10 service.Execute();
    11 }
    12 }
    • ServiceConsumer inherits System.ComponentModel.Component. This way our consumer may be embedded into containers.
    • Lines 8 and 9 invoke the Component.GetService method in order to obtain an implementation of ICommonService - note that our consumer class does not depend on the actual service implementation

    Now, we want to write the code to instantiate our actual service implementation. We'll do it smarter by using the configuration namespace of the .NET Framework. First, we will create our application Xml configuration file:

    <configuration>
    <appSettings>
    <add key ="ServiceLocatorSample.ICommonService"
    value="ServiceLocatorSample.EmailService" />
    </appSettings>
    </configuration>

    Following is the implementation of our container. For the sake of simplicity, our container contains the code that deals with the actual service implementation. It reads our configuration file and creates the proper service instance.

    1 public class ConfigurableContainer:Container
    2 {
    3 public ConfigurableContainer():base()
    4 {
    5 }
    6 protected override object GetService(Type service)
    7 {
    8 string implementationTypeName =
    9 ConfigurationSettings.AppSettings[service.FullName];
    10 Type actualType = Type.GetType(implementationTypeName);
    11 if(actualType!=null) {
    12 return Activator.CreateInstance(actualType);
    13 }
    14 return base.GetService(service);
    15 }
    16 }
    • Our ConfigurableContainer inherits System.ComponentModel.Container
    • Lines 8 and 9 fetch the service implementation type name from the configuration file
    • Line 12 creates the service instance
    • In real world scenario, we should have additional exception handling code
    And finally, we will wrap up everything:
    1  string consumerName = "consumer1";
    2 ConfigurableContainer container = new ConfigurableContainer();
    3 ServiceConsumer consumer = new ServiceConsumer();
    4 container.Add(consumer,consumerName);
    5 ServiceConsumer myConsumer =
    6 (ServiceConsumer)container.Components[consumerName];
    7 myConsumer.Process();

    • line 2 creates an instance of the container
    • lines 3 and 4 create an instance of the ServiceConsumer and add it to the
      container's components list. This way our component is automatically bound to
      the container's service instantiation code
    • lines 5 and 6 demonstrate the usage of the ServiceConsumer instance
    • note that this code snippet does not depend on the actual service implementation

    Conclusion
    What are the benefits of using lightweight containers in .NET?

    • Maximized class decoupling, i.e. better design
    • Easier code maintenance - if we want to add an additional service implementation,
      we only need to implement a new class and modify our configuration file.
    • Maximized reusability - our component (ServiceConsumer) may be used by other
      developers, and they don't need to know details about the service creation
      process.
    • Creating complex, agile plug-in architectures based on proven standards is
      easier.
    This sample demonstrates a basic way to implement a lightweight container in .NET. In a real-world scenario, we may have a more complex container architecture. We may use layered containers (implementing a chain of responsibility), service containers, more complex configuration, etc.

    Unit testing
    To be continued...

    Dependency Injection and .NET

    Loose Coupling
    When reading about software design, one will inevitably meet the magic words loose coupling. This rather abstract concept is often considered a metric for code quality and testability.

    Besides the pure theoretical speculation, there are many significant works which shed light on some practical aspects of the loose coupling concept. One should read Inversion of Control Containers and the Dependency Injection pattern by Martin Fowler, because his work gives some great ideas about the dependency-breaking problem.

    .NET and Inversion of Control
    What are the ways to break dependencies in .NET in an elegant and efficient manner?

    • Implementing a form of Dependency Injection from scratch.
    • Using artifacts from the System.ComponentModel namespace to create lightweight containers. This form of implementation corresponds to the Service Locator pattern.
    • Using a third-party framework like Spring.NET. It supports different forms of Dependency Injection.

    However, the right approach should be evaluated in a particular context. The impact on code testability (the practical aspect) should be evaluated, for example.

    Tuesday, April 05, 2005

    Short presentation of Unit Testing and .NET

    I gave a short presentation on Unit Testing and .NET on 30.03.2005 during the monthly meeting of SofiaDev.org.
    It turns out my presentation skills are not that impressive.
    Fortunately, Google has an answer again - LILSeminaras.com
    They have some free presentation tips. However, I suppose real practice is the best option.


    I like the idea of using my blog as a diary.
    Well, not a detailed one, but it may one day give me a clue about my personal and professional wanderings.

    Monday, April 04, 2005

    Continuous Integration Again

    I was digging into the Continuous Integration theme and came across Martin Fowler's Continuous Integration article again - a classic in this field.

    And some thoughts came to my mind...
    I've always been an extreme fan of development process automation. I think it may be beneficial for organizations to automate most of the activities during the development process. This may be one of the ways to transform the current software handcrafting into a real industry.

    Why automate:
    A modern approach to software development is iteration-based development. It means that your project steps through a number of iterations. In each iteration, the team adds new features and fixes defects injected in the previous ones. A key aspect of this approach is that each iteration includes a lot of similar activities. These activities are vital for the project. However, the team has to spend time managing, performing and verifying the results of these activities, and that costs money. The time and money for these activities are spent every iteration. There is also another problem - people's ability to make mistakes. Mistakes may manifest themselves later, often when it is too late. Some methods to mitigate the risk of mistakes are training, personal and group reviews, etc. Such an approach may lower the probability of failing due to mistakes or lack of skills; however, it is pretty expensive and also tends to be a constant expense. And while training and reviews cannot be avoided, another weapon may be added to the organization's armoury - development process automation techniques.

    The benefit from automation

    • Avoids people's mistakes
    • Allows experts to focus on unique activities, thus decreasing the total production time
    • Decreases the total development costs and increases the quality of the product


    Some ideas of what may be automated:

  • Source code generation
  • Source code guidelines conformance checking
  • Source version control activities
  • Building/Packaging
  • Distribution
  • Code-level tests
  • System-wide tests
  • Team communication

    Note that the list above only scratches the surface of what may be automated. It may also turn out that some activities cannot be automated in a particular context. Development process automation techniques should be adopted with caution and an understanding of the possible drawbacks.

    A concept gaining velocity lately, which touches on some aspects of the above-mentioned ideas, is the Software Factories term

    Friday, April 01, 2005

    Unit tests and the emerging characteristics phenomenon

    The scientific theories about the complex systems like weather, biological brains, social groups and ant colonies often talk about the emerging characteristics phenomenon. The complex systems tend to demonstrate a complex structure and behavior. It is usually difficult ( or even impossible) for humans to map the observed system behavior to a number of low-level intra-system interactions.

    We may have a good knowledge about the characteristics of the simple
    building components of the system and about the observed behavior of the system as whole, but it’s difficult to analyze the dynamics. The large number of simple components interactions within the system forms the complexity of the system dynamics. The system demonstrates behavior characteristics, which are far beyond the characteristics of a simple building component.

    Software systems may be considered such complex systems. Although much simpler than the human brain, there are software systems which are too complex to be analyzed by a single person as a whole. The standard way to guarantee the quality of software systems today is to perform post-quality-control activities. This usually involves the QA division members, who perform a number of manual or automated tests on the system as a whole. The effects observed are classified as expected behavior or defects. The defects should then be analyzed and mapped to specific, simpler building components in order to be fixed. It is often not trivial to say whether an observed defect is due to a static or a collaboration issue, or whether it is a design or a coding issue. It is often difficult to find the exact problematic code section. And what is always valid is that this process tends to be slow, inefficient and expensive. Organizations often do not try to analyze the reasons for defect injection. They are not trying to optimize the development process and minimize the cost of the post-control activities. However, this is another story.

    My point of view is that most of the post-quality-control procedures, which act upon the system as a whole, can't on their own deliver the quality that we want. A major reason is the complexity and the emergent characteristics phenomenon. The integrated system has a complexity far beyond its simpler building components. What if we added another approach to our currently used defect-prevention methods? It would be beneficial to try to decrease the defect injection rate before system integration. It is just much easier to control the quality of the simplest system components. And we have good candidate tools. Unit testing theory is a good way to control the quality of the simplest system components. It's even better - unit tests focus the developer on the vital component usage issues, thus helping to produce a better low-level design.

    A major argument against using unit testing is the additional development overhead. After all, it costs money and time. The first answer that comes to mind is "Quality costs money". Unfortunately, this answer will not convince managers to invest a bunch of money in another quality-control methodology. However, it is not quite true that unit testing increases the production cost. Unit testing actually decreases the total cost of the production cycle by reducing the cost of refactoring/changes and reducing the defect injection rate, thus making the post-quality-control activities cheaper. It makes developers feel confident about their code, thus motivating them to innovate.

    Saturday, March 12, 2005

    MVC architectural pattern in ASP.NET


    The MVC pattern in general
    When talking about presentation frameworks, it seems that the most popular architectural pattern used is MVC (Model-View-Controller). The Model stands for the application logic - e.g. this is where data structures are manipulated and database and business components are accessed. The View and the Controller stand for the user interface of the framework. Conceptually, the user interface may be considered a combination of input and output components. In the MVC case, the View and the Controller may be considered these output and input components. This is especially true in HTML/HTTP-based environments. The Controller receives the user requests (serves as input) and the View outputs content to the user. This pattern enforces the actual representation (View) to be separated from the application logic (Model).
    In the Java community MVC also stands for the original MVC implementation (using JSP for the Controller/View implementation and JavaBeans for the Model implementation), which breaks a little the idea proposed by the MVC pattern. Later Sun proposed a second implementation of the MVC pattern, called MVC2.

    The MVC pattern from the ASP/NET tower
    Going down from the abstract patterns and closer to the implementation details, we will see that ASP.NET uses an implementation of MVC called Page Controller (one may argue whether the Page Controller should be considered an architectural pattern or an implementation approach; however, that depends on the goals of the analysis). Unfortunately, it is not a very useful approach when dealing with large, complicated applications. I am talking about the cases where we have requirements for complicated workflow management, aggregation of multiple heterogeneous backend resources, end-user device-specific content rendering, a long application life cycle and frequent business model changes.
    In this case another MVC implementation approach is more appropriate. It is called Front Controller. ASP.NET does not offer this kind of MVC implementation out of the box; however, it provides the tools to build such a framework.
    A suitable approach for implementing a Front Controller is to use an IHttpHandler implementation for the controller. IHttpHandler implementations are "plugged in" to the ASP.NET request processing pipeline. They are a very clean way to implement the Controller pattern.
    Here comes an example of a Front Controller implementation
    The proposed implementation is a good start for playing with a front controller. However, it uses *.aspx files to represent a single view. This approach may introduce quite a heavy development effort if we have a large number of views to implement.
    And because in this case we will have a longer and more complicated development period, it is more probable that we will inject more defects into the final solution. One way to mitigate this risk is to use an alternative View implementation.
    The .NET framework introduces the System.Xml namespace. It implements a large number of classes for Xml manipulation.
    This is where we may search for an elegant solution. We'll need to accomplish the following steps:
    1. Design a custom Xml-based declarative language to define our views - an Xml schema. We call this the Xml View Definition.
    2. Design and implement an Xml View Definition Processor. It may be composed of 2 parts: an Xml View Transformation (xslt) and a code-based processor. In order to compose our view in ASP.NET format, we will execute the following steps (see the sketch after these notes):
          2.1. *Convert the Xml View Definition file by using our Xml View Transformation and receive html content.
          2.2. Pass the html content to the Page.ParseControl method and add the resulting controls to the Page.Controls collection.
          2.3. **Pass the Xml View Definition to the code-based processor in order to handle additional processing - for example event binding, data binding, ...
    * We may have a separate Xml View Transformation for every type of end-user device: IE, Netscape, PDA, smartphone - and why not XAML.
    ** A lot of work may be done by the Xml Transform Processor by using the XsltArgumentList.AddExtensionObject method to add references to .NET classes. These classes may be invoked by the transformation processor during the Xslt transformation process. This way, we may avoid the need for a separate code-based processor.
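
    A minimal sketch of such a controller (much simplified: it writes the transformed html directly instead of going through Page.ParseControl; the file paths, class names and web.config registration are illustrative):

    using System.IO;
    using System.Web;
    using System.Xml;
    using System.Xml.Xsl;

    //registered for e.g. "*.view" requests via <httpHandlers> in web.config
    public class FrontControllerHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            //map the request to an Xml View Definition file
            string viewName = Path.GetFileNameWithoutExtension(context.Request.Path);
            XmlDocument viewDefinition = new XmlDocument();
            viewDefinition.Load(context.Server.MapPath("~/Views/" + viewName + ".xml"));

            //step 2.1: run the Xml View Transformation to get html content
            XslTransform transform = new XslTransform();
            transform.Load(context.Server.MapPath("~/Views/ViewTransformation.xslt"));
            StringWriter html = new StringWriter();
            transform.Transform(viewDefinition, null, html, null);

            context.Response.Write(html.ToString());
        }
    }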

    Note that this only scratches the surface of the complexities of implementing such a framework. In order to have fully declarative view definitions, it would be beneficial to develop a declarative workflow definition as well. This workflow definition should be processed by the Controller. However, the benefit of using such a declarative approach for view definitions may reduce the resources needed to maintain/update the solution in a rapidly changing business environment. Of course, this approach has its limitations, so the pros and cons should be estimated for every particular project.

    Following are some common pros and cons:
    Pros:
    * The declarative view/workflow definitions eliminate the need for deep development knowledge when adding new views to the solution
    * The Xml View definitions may make it possible to support multiple end-user device types
    * Easier changes to the workflow and views in order to respond to business changes

    Cons:
    * Greater development impact during the general framework implementation. It may not be suitable for every project.
    * The workflow/view maintainability comes with a performance cost. This may be compensated (a bit) by utilizing a good caching scheme.
    * Implementing good postback handling is a challenge
    * It may be challenging to embed third-party ASP.NET based Web Controls.

    Sunday, March 06, 2005

    Job Performance and Technology

    I've found the following figures, which describe a terrible picture of what is happening on the tech scene. It should be noted that this may be only half of the truth. However, it should make us think about software development processes and the quality of the professionals...

    "52% of all innovative projects fail and 31% percent of these projects are canceled before producing a single deliverable (Kapur, 1997)."

    "On average, professional coders make 100 to 150 errors in every thousand lines of code they write, according to a multiyear study of 13,000 programs by Humphrey of Carnegie Mellon" and although "(s)ystems testing goes on for about half the process and even when they finally get it to work, there is still no design" (Mann, 2002)."

    "In the last 15 years alone, software defects have wrecked a ($500 billion) European satellite launch, delayed the opening of the hugely expensive Denver airport for a year, destroyed a NASA Mars mission, killed four marines in a helicopter crash, induced a U.S. Navy ship to destroy a civilian airliner and shut down ambulance systems in London, leading to as many as 30 deaths" while the I Love You virus enabled by Microsoft's decision to allow Outlook to easily run programs in e-mail attachments cost $8.74 billion according to consulting firm Computer Economics (Mann, 2002).

    "According to a study by the Standish Group' software projects often devote 80 percent of their budgets to repairing flaws they themselves produced a figure that does not include the even more costly process of furnishing product support and development patches for problems found after the release" (Mann, 2002)."

    Saturday, March 05, 2005

    Post Quality Control in Software Development

    I've read some articles about the Balanced Scorecard (www.balancedscorecard.org) management concept. It is a fairly new management idea, developed in the early 1990s by Drs. Robert Kaplan (Harvard Business School) and David Norton. It deals with the performance management concept, and here are some thoughts on the software development process which came to my mind... As it appears to me, most of the methodologies and disciplines intended to deal with the software development process rely too much on post quality control. A lot of organizations adopt a particular software development methodology hoping to improve their product quality. Often the problem actually comes from the "adoption" process. The organizations rely on, and spend too much effort performing, post quality control activities. Words like "zero defect" and "defect convergence" have become labels for software project success. However, testing, inspecting and simply doing defect fixing is just not very effective. Defect fixing may be quite an expensive process:

    • it takes time, and the user may not be able to use the solution to do his job until the defect is fixed (if the solution is in production)
    • it takes resources (managers, developers, time, money, etc.), and these resources will probably not be paid back by the customer
    • it is always a risk to make changes - the later in the life cycle, the bigger the risk.

    The post quality control in most organizations has only one goal - defect fixing. This way the organization continuously spends constant resources on defect fixing but never tries to identify the reasons for the defect injection and decrease the quality control cost. Often people think "We have this post quality control process, so we will design and implement fast, because we are chasing deadlines, and if we have defects we will fix them later" - this psychological trend may increase the cost of the post quality control. And as always, most practices and concepts share common basic ideas, which float around the culture space during a particular time frame. My point is that the reason for these trends in the current development methodologies and practices is that they were inspired by some worldwide management practices used at the time the methodologies were published - Total Quality Management (TQM), for example. What we need is a quite new software development paradigm...

    Articles about SOA Messaging Patterns

    I found these 2 articles by Soumen Chatterjee, which discuss SOA messaging patterns (part one and part two). They are very interesting articles indeed.

    http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnmaj/html/aj2mpsoarch.asp
