Infosys Microsoft Alliance and Solutions blog


February 29, 2008

Expression Blend issue with Globalized WPF Application

Recently, while supporting a project that was converting its WPF application for globalization and localization support, we ran into an issue working with the Expression Blend 2 December Preview. We used the recommended approach for globalization and localization, and after doing all that, Expression Blend failed to load the user controls in design view.
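For context, here is a minimal sketch of the assembly-level attribute that the recommended localization approach relies on (set in AssemblyInfo.cs, together with a <UICulture>en-US</UICulture> entry in the project file); the exact culture name is project specific. It is this setting that makes the en-US satellite assembly the fallback that Blend later fails to locate.

    using System.Resources;

    // Treat en-US resources as the fallback and load them from the
    // en-US satellite assembly rather than from the main assembly.
    [assembly: NeutralResourcesLanguage("en-US", UltimateResourceFallbackLocation.Satellite)]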

You would get an error like this - "MissingSatelliteAssemblyException: The satellite assembly named yourassemblyname for fallback culture 'en-US' either could not be found or could not be loaded. This is generally a setup problem. Please consider reinstalling or repairing the application."

This was surprising since the specific assembly was already present and the application as such was running fine. It therefore had to be an Expression Blend specific issue. Usually assembly load issues are related to the path from which the assembly is loaded. It hence appeared that Expression Blend was most likely looking in its own installation folder to find these files.

This was confirmed when copying the contents of the en-US folder (inside bin/debug or bin/release) to Expression Blend's install folder solved the issue. The default install path for Expression Blend 2 is C:\Program Files\Microsoft Expression\Blend 2 December Preview.

Note that the en-US folder and its contents need to be copied. Directly copying the assembly doesn't help; as per the globalization logic, Blend still looks for a folder named en-US. Additionally, after copying the folder and assembly, you need to restart Expression Blend. It doesn't automatically load the assembly if Expression Blend is already running.

Later I found a similar discussion here, so it does look like a confirmed bug with the Expression Blend 2 December Preview edition. Hopefully this will get fixed in the next CTP.

February 28, 2008

Common platform for building SOA and distributed Microsoft based applications

Microsoft is trying to regain ground in the SOA space. Today, the challenges an enterprise faces when building SOA based applications are the huge effort required to design, build, deploy and manage them. Microsoft has a wide range of products like Windows Communication Foundation (WCF) and BizTalk Server, and emerging technologies like Silverlight and BizTalk Services, to support SOA development, but that is not sufficient; there should be techniques that make building these applications simpler. Modelling is identified as one of the main areas that needs to be covered as part of application development. Models are used by everyone in the team during the project life cycle. Business analysts use them during requirement and process documentation. Architects create models for schemas, services and high level design, while developers create models for rules and workflows.

The issue here is that each of these models lives in a silo. Each team member uses his own tool and framework for modelling, which creates a communication barrier; the models live in isolation and there is no end to end view of the entire application across the business, architecture and development perspectives. This is where I feel the next version of the application platform products, which Microsoft has named "Oslo", will provide a unified view and a common platform for building SOA and distributed applications.


Oslo will span the next versions of the application platform products: Microsoft Visual Studio “10,” Microsoft System Center “5,” BizTalk Server “6,” BizTalk Services “1” and Microsoft .NET Framework “4.”

Oslo can be termed Microsoft's investment in the space of SOA and business process management. It will help in the development of distributed applications, with the main areas being service enablement and model driven architecture.

The main areas that "Oslo" targets for development are

1. Framework - .NET Framework 3.5, which will have model driven development using Windows Communication Foundation (WCF) and Workflow Foundation (WF) technologies.
2. Server - BizTalk Server 6, which will be enhanced further to develop, deploy and manage distributed applications.
3. Services - BizTalk Services, which will offer hosted messaging, identity and workflow capabilities.
4. Tools - Visual Studio 10, which will provide deep support for model driven design and deployment of composite applications.
5. Repository - System Center 5, Visual Studio 10 and BizTalk 6 will utilise a common repository for managing code, versions and models.

With the advent of "Oslo", whose CTP is expected in 2008, clients will be able to solve a series of problems. The primary one is that composite applications, which used to be built at high cost and complexity, can now also be built by smaller enterprises, making such applications feasible and economical.

The advantages are

1. Provides a unified platform for Software plus Services: Oslo will help in delivering an enterprise class, unified platform for building SOA and Software as a Service applications. With the advent of cloud service modelling in Oslo, organisations can take advantage of the flexibility to deploy applications in-house, third party hosted or Microsoft hosted.
2. Connects the end to end life cycle: With multiple Microsoft products as part of Oslo, this will enable managing composite systems end to end as a whole rather than as individual pieces.
3. Enhances the software development process: “Oslo” will greatly simplify the development of composite applications that can be more easily changed, because the underlying model is the application (without handoffs between people or systems involved in the software development life cycle).
4. Integrates with what you have: “Oslo” lets you build on the existing and familiar investments you've already made in skills and technology on top of the Microsoft SOA platform, while simultaneously opening up a rich new set of capabilities.
“Oslo” enhances and aligns BizTalk Server and the .NET Framework and provides significant enhancements across a range of SOA infrastructure services such as federated identity, messaging and long running activities. “Oslo” also enables simplified service enablement and composition of your existing legacy or packaged applications through rich interoperability support delivered through adapters, Web services and Web 2.0 protocols.


I guess there is huge anticipation for the Oslo beta release, and organizations looking at SOA via the Microsoft platform are in for some good times.


February 26, 2008

Additional Relational Metadata for artifacts in MOSS

Organizations which select Microsoft Office SharePoint Server 2007 (MOSS 2007) for managing artifacts (like documents and records) often leverage the out of the box (OOTB) features available in MOSS, like libraries, OOTB workflows, version management, workspaces, etc. When it comes to metadata, there is often a requirement to attach additional relational metadata on top of the flat/standard metadata offered by MOSS. E.g. documents/records may have to be associated with different hierarchies: ABC.doc may need to be part of both the India zone and the US zone, drilling down through the structure, so that the document is retrieved when queried on data across both zones.

We (myself and one of my colleagues, Prasana Srinivasan) did a small exercise and would like to share the custom approach we took to add additional hierarchical metadata, which involved a custom database and an ASPX page. We will present this as a 2 part series.

In the solution we take a sample document for attaching the hierarchical metadata. We have also defined a custom DB schema as shown below.

The database handles one additional metadata field; however, it can be extended to take care of any number of additional metadata fields.

Quick one liner description of the tables

Hierarchy Master – Contains the master values of the hierarchical metadata, like location, division and subdivision.

Hierarchy Header – Contains details about the hierarchy header values, like legal entity and location. These values are populated in the combo box for selection.

Hierarchy Mapping – Provides the mapping of the header with the hierarchy master. In this table the master values are grouped under a specific header.

Document Master – Specifies the header values of the documents.

Document Details – Contains the extended properties of the document with respect to the mapping Id from Hierarchy Mapping (metadata values updated by the user against the hierarchy master value).

Note: This is just a representative schema for demonstration purposes; users can have a schema as per their requirements.

The rest we will see in the next part of the series.

Gupta to .Net migration

I have been working on a few opportunities around Gupta to .NET migration. In most cases the client was thinking of a complete rewrite from Gupta to .NET 2.0.

During my research on vendors who can provide tools/services for migrating Gupta code to .NET, I came across a vendor known as Ice Tea Group (ITG), which has tools and expertise in this kind of code migration.

ITG provides licensing of the tool as well as services for various phases during the migration project life cycle.

ITG has a tool known as PPJ Inventory Analyser which helps in the analysis of the existing Gupta application.

PPJ Inventory will open your top level applications and all included libraries and collect the following information:

· number of items

· included libraries

· included dynalibs

· external functions

· images and bitmaps

 

PPJ Inventory will also add warnings if certain items of the SAL language are used which are known to require more or less manual intervention, or which cannot be ported at all.

The tool needs a Gupta runtime in order to run. It has a very intuitive GUI which takes the Gupta code as input for the report to be generated. The generated report is in XML format.

The translation process is straightforward and quite flexible.

Ice Porter™ loads the source SAL application using the CDK, parses all expressions into expression trees, loads and binds all known assemblies (including modules that have already been ported and custom plugins), analyzes all references and relationships, re-generates the application into a CodeDOM structure (XML-like Code Document Object Model), optimizes the resulting structures, and finally feeds the final CodeDOM to code renderers that generate the final source code in a reliable and consistent manner.

Throughout the entire process, the tool generates special events that can be handled by custom-developed translation filters. Translation filters can modify and enhance the translation process at every step.

Our translation technology can be adapted to virtually any requirement.

Migration Support Library

The migration support library is called PPJ Framework™.

All migration solutions have a support library. Some may be better or more extensive than others, but all automated, semi-automated, or manual conversion approaches have one. (If someone from marketing tells you otherwise, don't believe them.)

Our library is entirely written in C# and it directly extends native types and native controls.

The only third party component that we use is the FlexGrid .NET control, for which we have obtained an OEM license. The database layer in the PPJ Framework is entirely based on ADO.NET and can use any ADO.NET compliant driver. Our library is as thin and modular as possible. The Visual Toolchest implementation is in a separate module, the XSal2 implementation is also in a separate module, and so are the M!Table implementation and the reporting engine. This approach allows you to deploy only the modules that are needed by the project.

You also get the full C# source code for free, as part of the maintenance agreement.
 
Customizations
 
Using some of the unique Translation Filters technology, we can add any kind of additional processing logic to the translation of the Gupta code and the generation of the new .NET code.

In addition to our standard customizable filters, we can custom develop personalized rules to normalize the UI of the application, extract documentation in a specific format, generate ad-hoc wrappers to fit into an SOA architecture, change the naming convention, and much more. There is almost no limit to what we can do.

Customers can also develop their own translation filters directly in C# or VB.NET.

The cost of the tool is very competitive, and the services offered are economical and a right fit for an automated code migration to .NET.

Webinar on Advanced Collaborative Supply Management

This is a webinar on current trends and issues in sourcing and procurement in the high-tech and manufacturing sectors. The webinar showcases Infosys's Advanced Collaborative Supply Management (ACSM) solution.

Interact with experts from Infosys, AMR Research and Microsoft on Wednesday, March 5, 2008 at 1300 EDT.

For more details and registration, visit http://www.infosys.com/newsroom/events/2008/ACSM-webinar.asp

February 25, 2008

Group By Many/Multiple Criteria using LINQ to SQL (L2S)

Developers may find it annoying not to find an out of the box query operator in L2S to group by many/multiple criteria, which is used very frequently in T-SQL queries. No sweat, there is an easy way out; I will try to illustrate it here.

SP written in T-SQL

SELECT
      InventoryYear.InventoryYearID,
      SUM(ActivityEmission.CanonicalEmissionAmount) AS CanonicalEmissionAmount,
      ActivityEmission.CanonicalUnitId,
      ActivityEmission.GasId
FROM
      InventoryYear
      INNER JOIN EntitySUISInventoryYear
            ON InventoryYear.EntityID = @EntityId
            AND InventoryYear.EntityID = EntitySUISInventoryYear.EntityID
      INNER JOIN SUIS
            ON SUIS.InventoryTypeID = @InventoryTypeId
            AND EntitySUISInventoryYear.SUISID = SUIS.SUISID
      INNER JOIN ActivitySource
            ON ActivitySource.EntityID = InventoryYear.EntityID
      INNER JOIN ActivityEmission
            ON ActivityEmission.ActivitySourceID = ActivitySource.ActivitySourceID
      INNER JOIN EntitySUISInventoryYear ESIY
            ON ESIY.EntitySUISInventoryYearID = EntitySUISInventoryYear.EntitySUISInventoryYearID
            AND ESIY.IsComplete = 1
GROUP BY
      InventoryYear.InventoryYearID,
      ActivityEmission.CanonicalUnitId,
      ActivityEmission.GasId

The same can be written in L2S as follows, using anonymous types.

Let me create a business object called Emission which will store the results retrieved from the query

    public class Emission
    {
        public int InvId { get; set; }
        public int CanId { get; set; }
        public int GasID { get; set; }
        public decimal? GasAmt { get; set; }
    }

L2S query to retrieve the results 

    List<Emission> emission = (from i in InventoryYears
                               from j in EntitySUISInventoryYears
                               from k in ActivityEmissions
                               from l in ActivitySources
                               from m in SUIs
                               where i.EntityID == j.EntityID && i.EntityID == 1 &&
                                     m.InventoryTypeID == 1 && j.SUISID == m.SUISID &&
                                     l.EntityID == i.EntityID &&
                                     k.ActivitySourceID == l.ActivitySourceID &&
                                     j.IsComplete == true &&
                                     j.EntitySUISInventoryYearID == j.EntitySUISInventoryYearID
                               group k by new { i.InventoryYearID, k.GasID, k.CanonicalUnitID } into grouping
                               select new Emission
                               {
                                   InvId = grouping.Key.InventoryYearID,
                                   GasID = grouping.Key.GasID,
                                   CanId = grouping.Key.CanonicalUnitID,
                                   GasAmt = grouping.Sum(c => c.CanonicalEmissionAmount)
                               }).ToList();

The key here is the way data is grouped. We just need to group the columns (the columns grouped via GROUP BY in T-SQL) into a grouping variable (in this case, grouping). Then run a select over the grouping variable (it will be of type IGrouping<TKey, ActivityEmission>, which can be enumerated as an IEnumerable<ActivityEmission>) to pick out the key columns and compute the aggregates needed.
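To isolate the pattern from this particular schema, here is a minimal, self-contained LINQ to Objects sketch (the Order type and its sample data are hypothetical, not from the project above) that does the same multi-key grouping and aggregation using an anonymous type as the key:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class Order
    {
        public string Region { get; set; }
        public int Year { get; set; }
        public decimal Amount { get; set; }
    }

    public class GroupByExample
    {
        public static void Main()
        {
            List<Order> orders = new List<Order>
            {
                new Order { Region = "APAC", Year = 2007, Amount = 100m },
                new Order { Region = "APAC", Year = 2007, Amount = 250m },
                new Order { Region = "EMEA", Year = 2008, Amount = 75m }
            };

            // Group by an anonymous type holding both keys, then aggregate per group;
            // this is the equivalent of GROUP BY Region, Year in T-SQL.
            var totals = from o in orders
                         group o by new { o.Region, o.Year } into g
                         select new
                         {
                             g.Key.Region,
                             g.Key.Year,
                             Total = g.Sum(x => x.Amount)
                         };

            foreach (var t in totals)
            {
                Console.WriteLine("{0} {1}: {2}", t.Region, t.Year, t.Total);
            }
        }
    }

The anonymous type gives value-based equality across both properties, which is what makes it behave like a composite GROUP BY key.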

Note: The intention here was to create the equivalent of a T-SQL group by multiple criteria in L2S. I had some trouble using join and hence resorted to where clauses; such where clauses are internally mapped to inner joins. Opinions are welcome to make the L2S query better.

February 20, 2008

Controlling home appliances from outside - “I can be home when I’m not.”

Concerned about conserving energy? Need to change the temperature of your air conditioner as you drive back home? Want to notify that you are on vacation and have your lights turn on and off at specified times?

Now you can do it through your mobile phone, PDA, or a home or office PC.

It's possible. The .NET Micro Framework and C# make it easy.

 

The Microsoft .NET Micro Framework

The Microsoft .NET Micro Framework combines the reliability and efficiency of managed code with the premier development tools of Microsoft Visual Studio to deliver exceptional productivity for developing embedded applications on small devices. The .NET Micro Framework is supported by a number of ARM7- and ARM9-based processors. A minimum of 256 KB of RAM and 512 KB of Flash/ROM is required for development and deployment; a memory management unit is not mandatory.

The Z-Wave Technology

Z-Wave, which was codified through an industry alliance in 2005, is a wireless radio frequency (RF) based communications standard that makes remote control effective and practical for homes of any size. The protocol, which is embedded in a microprocessor chip and built into a module or device along with memory (flash memory, RAM, or both), transforms a stand-alone appliance into an intelligent networked device that can be controlled and monitored wirelessly. Z-Wave delivers high-quality networking at a fraction of the cost of other similar technologies by focusing on narrow bandwidth applications and negating the need for costly hardware by employing innovative software solutions like the .NET Micro Framework.

A module with a microcontroller and Z-Wave Ethernet gateway

The Vizia RF Foyer is one such module, developed by Leviton Manufacturing and its partners.

The Vizia RF Foyer is the industry's first Z-Wave-compliant Ethernet gateway. It connects to an Internet-linked PC or laptop through the computer's Ethernet port and transmits signals to a Z-Wave home control network. Peer-to-peer mesh networks based on the .NET Micro Framework, the Vizia RF Foyer, and the Z-Wave protocol overcome the performance issues and high cost of earlier generations of wireless home control systems.

[Image: z-wave.jpg – Z-Wave mesh network]

Each Vizia RF Foyer module is equipped with a two-way radio chip that it uses to communicate with modules, called nodes, in the network. Z-Wave command signals travel from node to node along the network to their final destination. If any form of interference blocks the signal along the way (for example, a wall or a large appliance such as a refrigerator), the signal is automatically rerouted through other nodes until it reaches its destination. The “self-healing” feature of the Z-Wave mesh network lends it unparalleled reliability.

An embedded application, such as the one in the Vizia RF Foyer, customarily takes approximately one year to develop. However, with the .NET Micro Framework, Leviton's software partner ControlThink was able to produce a working proof of concept in three days and, porting its existing .NET code base, completed the final application within three months.

 

 

Licensing Model in SharePoint

There has been a lot of confusion about licensing in SharePoint: what comes free, what is the licensed part of the product, and what client access license (CAL) options are available. The following diagram helps clarify most of the doubts one may have around this.

[Image: Licensing.png – SharePoint licensing layers]

Source: Microsoft Documentation 

The bottom layer shows the features available with WSS 3.0, which comes free of cost. The intermediate layer shows the features of MOSS available with the basic Client Access License (CAL). Most of the content management is possible with the basic CAL. However, to leverage other services, like accessing line of business application data, Excel Services, or Forms Services for viewing InfoPath forms through the browser, additional client access licenses will be required; these are shown as the top most layer in the figure.

Note: In the case of Internet portals there is no concept of CALs, and the licensing cost for MOSS is close to 8 times the cost of the MOSS server product.

 

February 16, 2008

Silverlight - Dynamic Languages

Microsoft has announced that Silverlight applications can be built using dynamic languages such as IronPython, IronRuby and Managed JScript. It's something really interesting because it provides an opportunity for the developers of any of these languages to build exciting Silverlight Applications.

Dynamic languages perform tasks at execution time without recompiling the source code. One of the reasons they are called "dynamic" is that typing in these languages is dynamic: the source code contains no explicitly defined types, and types are created dynamically at execution time. With Silverlight 2.0, we get the DLR (Dynamic Language Runtime), which provides a compiler for each of these languages. It runs on top of the CLR, generates code at execution time, and provides the dynamic typing.

So what does this mean for a developer? Well, if we author the source code, run the Silverlight application and, while the application is running, change the source code and refresh the browser, the changes are reflected in the application. Sounds interesting, doesn't it? Moreover, the dynamic language initiative from Microsoft is not limited to Silverlight. Future releases of ASP.NET are likely to support these dynamic languages (IronPython in particular) for developing web applications. Interested developers can try this out using the ASP.NET Futures package (CTP).

ASP .NET AJAX - XML Script

Most developers today understand that it is all about authoring good JavaScript code to create good ASP.NET AJAX applications. That is in fact true. However, there is an attempt from Microsoft to make AJAX programming a much better experience in the future. XML Script is a step in that direction. So what is XML Script?

XML Script is a declarative language that is added to the ASP.NET markup code. It is used to create JavaScript objects at runtime and set the necessary properties and behavior on them. In web programming, we can separate markup and style by creating a CSS file. In ASP.NET programming we can separate design and behavior using the markup file (.aspx) and the code-behind file (.aspx.cs or .aspx.vb) respectively. XML Script does exactly the same, instantiating JavaScript components using a declarative script language.

So what is so good about XML Script? Here are a few advantages:

- Being declarative is itself an advantage; the markup carries clear semantics.
- Designers can be easily built for declarative code.
- Any declarative language is more expressive, and this is no exception.
- It avoids the need to write multiple event handlers while keeping object property values synchronized.

XML Script is currently part of the ASP.NET Futures package. It is not officially supported by Microsoft as of now. However, it should be good for the appetite of a programmer. The ASP.NET Futures package (CTP) can be downloaded from here.

Watch this space for more on this.

February 14, 2008

Debugging WPF Databinding errors

For an application I was working on recently, I had a tough time debugging some data binding issues. Then I came across this excellent blog. My personal favorite is the new trace level feature of .NET 3.5. For some reason, I could not get the second option to work.
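For reference, here is a minimal sketch (the Person type, property name and helper class are mine, purely illustrative) of turning that trace level up on a single binding from code; the usual alternative is the diag:PresentationTraceSources.TraceLevel="High" attached property in XAML.

    using System.Diagnostics;
    using System.Windows.Controls;
    using System.Windows.Data;

    public class Person
    {
        public string Name { get; set; }
    }

    public static class BindingDebugHelper
    {
        public static void BindWithTracing(TextBox targetTextBox, Person source)
        {
            Binding binding = new Binding("Name") { Source = source };

            // Ask WPF (.NET 3.5) to emit verbose diagnostics for this one binding
            // to the Output window / attached trace listeners.
            PresentationTraceSources.SetTraceLevel(binding, PresentationTraceLevel.High);

            targetTextBox.SetBinding(TextBox.TextProperty, binding);
        }
    }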

February 5, 2008

Required Field Validation in WPF

In my earlier blog on user input validation in WPF, I discussed how extension methods can be used to provide custom validation logic. If your need is simply mandatory fields, the kind of validation that the RequiredFieldValidator control provides in ASP.NET, you could very well use the concept of data triggers in WPF.

One of the key benefits triggers offer is that they can be managed effectively within the XAML without usually requiring much procedural code. Also, since triggers are raised on specific actions/events, they are automatically undone when the action/event is over. This comes in handy and saves code, since otherwise, in procedural code, you need to handle both cases (something that is seen in my work with extension methods).

So if you need, say, TextBox controls that should not be empty and should contain some value, you can create a Style with a data trigger that fires when the TextBox's Text property is empty. The style will look something like below:

        <Style x:Key="requiredFieldValidationStyle">
            <Style.Triggers>
                <DataTrigger Binding="{Binding RelativeSource={RelativeSource Self}, Path=Text}" Value="">
                    <Setter Property="TextBox.BorderBrush" Value="{StaticResource FaultyBorderBrush}" />
                    <Setter Property="TextBox.ToolTip" Value="Input value is mandatory" />
                </DataTrigger>
            </Style.Triggers>
        </Style>

Note how the trigger uses RelativeSource to bind back to its own Text property. If the value of the property is an empty string (""), the BorderBrush is set appropriately to show a red border, and a ToolTip is also attached to indicate what the error is. The BorderBrush uses another local brush resource, which I defined as below:

        <LinearGradientBrush x:Uid="LinearGradientBrush_1" x:Key="FaultyBorderBrush" EndPoint="0,20" StartPoint="0,0" MappingMode="Absolute">
            <GradientStop x:Uid="GradientStop_1" Color="#FFABADB3" Offset="0.05"/>
            <GradientStop x:Uid="GradientStop_2" Color="#FFE2E3EA" Offset="0.07"/>
            <GradientStop x:Uid="GradientStop_3" Color="#FFFF0000" Offset="1"/>
        </LinearGradientBrush>

Finally, this Style can be applied to a TextBox, and it will fire at run time based on the Text property of the TextBox:

        <TextBox Style="{StaticResource requiredFieldValidationStyle}" />

A few things to remember
1. The Style created can be applied to any control. It will however not work for controls that don't expose a Text Property.

2. When the above style is added to the XAML, VS designer gives an error - "Value 'TextBox.ToolTip' cannot be assigned to property 'Property'. 'ToolTipService' type does not have a matching DependencyObjectType" and doesn't load the design view. The XAML loads properly in Expression Blend though. I am trying to figure out a fix for this. The code will however compile, build and execute without issues.

3. Another way to define the Style is to use TargetType and set it to TextBox. This way the Style will be used for any TextBox in the XAML and you don't need to set it explicitly. The TextBox qualifiers on the property names also go away. The Style will look like the following:

        <Style TargetType="TextBox">
            <Style.Triggers>
                <DataTrigger Binding="{Binding RelativeSource={RelativeSource Self}, Path=Text}" Value="">
                    <Setter Property="BorderBrush" Value="{StaticResource FaultyBorderBrush}" />
                    <Setter Property="ToolTip" Value="Input value is mandatory" />
                </DataTrigger>
            </Style.Triggers>
        </Style>

Another side benefit of this is that the VS designer works properly with this syntax.

4. The Style is only a visual indicator. In the code, you will still need to check whether TextBox.Text is empty as part of your validation logic (a small sketch of this check follows).
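For completeness, here is a small sketch of that code-side check (the helper class and method name are mine, purely illustrative):

    using System.Windows.Controls;

    public static class RequiredFieldChecker
    {
        // Returns true only when every TextBox passed in has a non-empty value;
        // the data trigger above provides the visual cue, this provides the logic.
        public static bool AllHaveValues(params TextBox[] textBoxes)
        {
            foreach (TextBox textBox in textBoxes)
            {
                if (string.IsNullOrEmpty(textBox.Text))
                {
                    return false;
                }
            }
            return true;
        }
    }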

This, essentially, gives you a pure XAML based visual representation for required field validation using data triggers.

February 4, 2008

Are Remote MSMQ Queues Reliable?

I have played around a bit with MSMQ private queues and documented some of my findings earlier. In my first blog on this topic, I had captured how the naming of the queue is critical to connecting to the right queue.

If it was a local private queue, you could use - ".\\private$\\queuename"

and if it was a remote private queue, you would use - "FormatName:Direct=OS:machinename\\private$\\queuename"

In my second blog, I mentioned how to receive messages from remote private queues. The key to receiving messages (apart from the correct queue format name) is to have an appropriate formatter so that the messages from the queue can be read properly.

What I do find strange is that attaching a formatter isn't necessary when sending messages. If a default formatter is plugged in automatically in that case, why can't one be plugged in when receiving messages? One reason I can think of is that the outgoing formatting is probably decided based on the type of the message being sent out, while when reading, you don't know what is being read; it could also be a security issue to read from the queue without really knowing what to expect.

Finally, in my third blog, I discussed how to configure and use journal queues.

What has intrigued me are some of the comments people have raised on those blogs, along with my own experiment results. I document them here.

When working with remote private MSMQ queues, keep in mind that they aren't 100% reliable. Well, the queues as such are, but the way the .NET APIs function, they appear not to be. Very simply, an application can't easily find out whether a particular private queue exists on a remote machine or not. You could use the static method MessageQueue.GetPrivateQueuesByMachine(), pass the remote machine name and get the list of queues on it, but when sending a message, if for some reason the queue has been deleted, the message will still be sent from the application and it will be lost.

MessageQueue.GetPrivateQueuesByMachine() returns a MessageQueue collection, and you can access the FormatName property of individual queues in this collection to get the correct format name for the remote private queues. In case the remote machine isn't accessible when making the MessageQueue.GetPrivateQueuesByMachine() call, a System.Messaging.MessageQueueException is raised with the following details:

ErrorCode - "-2147467259"
Message - "Remote computer is not available."
MessageQueueErrorCode - System.Messaging.MessageQueueErrorCode.RemoteMachineNotAvailable

If the remote machine is accessible, but the MSMQ service isn't running on that machine, the System.Messaging.MessageQueueException is still raised, but the details become cryptic, as below:

ErrorCode - "-2147467259"
Message - null
MessageQueueErrorCode - "-2147023143"
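Putting the two failure modes together, here is a minimal sketch (the machine name is hypothetical) of probing a remote machine's private queues and distinguishing the errors described above:

    using System;
    using System.Messaging;

    public class RemoteQueueProbe
    {
        public static void Main()
        {
            string remoteMachine = "REMOTESERVER"; // hypothetical machine name

            try
            {
                MessageQueue[] queues = MessageQueue.GetPrivateQueuesByMachine(remoteMachine);
                foreach (MessageQueue queue in queues)
                {
                    // FormatName is the safe way to address a remote private queue.
                    Console.WriteLine(queue.FormatName);
                }
            }
            catch (MessageQueueException ex)
            {
                if (ex.MessageQueueErrorCode == MessageQueueErrorCode.RemoteMachineNotAvailable)
                {
                    Console.WriteLine("Remote computer is not available.");
                }
                else
                {
                    // For example, the machine is reachable but the MSMQ service
                    // isn't running; the details are cryptic, so log everything.
                    Console.WriteLine("MSMQ error {0}: {1}", ex.MessageQueueErrorCode, ex.Message);
                }
            }
        }
    }

Note that even when this call succeeds, it only tells you the queue existed at that moment; the send itself still won't fail if the queue disappears afterwards.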

In case either the remote machine is down or the MSMQ service on that machine isn't running, when you try to send a message from your application, the message is stored in a temporary outgoing queue that is created automatically on the local machine. From the application's point of view, the message was successfully delivered. Once the remote machine is up and the MSMQ service is running on it, the message from the temporary outgoing queue is automatically delivered. This looks like a good feature, but it also has its own dangers and you can get surprising behaviors. This is mainly because the entire logic is based only on validating connectivity with the remote machine and a running MSMQ service on that machine. Even if the queue to which you are trying to send the message doesn't exist, the message will still go and essentially vanish, since it isn't on the local machine anymore and never reaches the remote machine, as the queue isn't present anyway.

Another similar issue: let's say the remote queue does exist, but you don't have write access to it. Even in that case, the application will successfully send messages without realizing that they are lost. However, none of these issues exist if you are working with local private queues. In that case, invalid queue names, improper security access rights etc. are all validated and appropriate exceptions raised. Hence, if you are facing issues of disappearing messages when sending to remote private queues, try running the application on the remote machine itself with the queue path appropriately changed (to point to the local queue). This will help identify any queue name or security issues that might be causing the disappearing messages.

Does this mean that private queues aren't good to use? Not really! They work very well when they are local. In the case of remote private queues, the infrastructure doesn't automatically detect all error conditions, and you have to explicitly verify the right queue name and security permissions.

If you want complete infrastructure support as well for remote queues, you should be considering public queues and not private queues.
