Rashed Amini

The ara3n weblog

Upgrade from NAV 4.0 sp3 to NAV 2013 Beta

12th July 2012

There are many clients still on an old version of Dynamics NAV that never upgraded to 2009. Many of them decided not to invest in an upgrade because they didn't see the value, and I would agree with many of those customers. 2009 required more development and testing hours than previous upgrades. Many features were missing from the RTC, so some users were on the Classic client while others were on the RTC. Customizing reports was cumbersome, and modified forms and pages would go out of sync. In certain scenarios, an executables-only upgrade to 2009 was sufficient to use the web services features.
I would like to share the experience of upgrading an existing Dynamics NAV 4.0 installation to the Dynamics NAV 2013 beta. The client was on 4.0, support for 4.0 had expired a while back, and the client was interested in moving to a supported version. Moving to 2009 didn't make sense with the 2013 release right around the corner. The client wanted to see the application with their data and walk through their workflows in 2013, so that when the final build is released they will be ready to upgrade to it. The client is also using the Classic (C/SIDE) database. The work was assigned to a junior developer with a couple of months of experience, under my supervision. This client had about 350 modified objects: 82 tables, 109 forms, 96 reports, and 30 codeunits, which is a medium-sized customization.
I suggested merging the tables first, then the codeunits, then the forms, and the reports last. Based on his experience with this merge, he found merging tables easy; it went quickly using text-compare tools. Merging the codeunits was a little more difficult: some of them had changed dramatically, which made them harder to compare as text, mainly because he was comparing modified 4.0 text files against 2013 text files. He missed a couple of functions, but I helped him resolve those. For the forms, I suggested he redo the changes manually in pages instead of using the form transformation tool. This gave him more experience designing and working with pages and made him more familiar with the application; his comment was that pages were his favorite objects to merge. He had a text-compare tool diff modified 4.0 against standard 4.0, and he made the changes manually in pages.
When he started upgrading the reports, we came to the conclusion that in order to test them and make sure they work correctly, we needed actual data. So we decided to upgrade the data first, then upgrade the reports and run them to confirm they return the same results as the Classic versions. There is no direct upgrade path from 4.0 to 2013, so we went the route of upgrading to 2009 as an intermediate step. You need a 2009 database with all the custom fields; you can create one by copying all the fields in the 50000 range from the virtual Field table and pasting them into a Cronus database. You cannot paste option fields this way, so those you have to add directly to each table.
We imported the 2009 Step 1 upgrade codeunit and ran it, then loaded the 2009 Cronus objects with the custom fields, and started running into issues related to NA Payroll. Payroll was part of the North American localization in 4.0, and MS removed it in 5.0; there were a couple of fields on the Employee table that the customer was still using. We had to create custom fields, move the data out of those HR fields, and blank them out, since they were going to be deleted in 2009. Step 2 of 2009 then ran without any issues, and Step 1 of the 2013 upgrade toolkit was loaded into the database and also ran without issues. Then we opened the database with the 2013 Classic client, which upgraded it and converted all the text fields to Unicode. Next we loaded Step 2 of 2013; this part you have to run from the RTC. The process is supposed to create the new Dimension Set IDs. Here we ran into another issue: the customer had deleted a dimension value record instead of blocking it, and we were getting a "Dimension Set ID value cannot be blank" error. I recreated the value, and then we ran into a second issue: there were Ledger Entry Dimension records with a blank dimension value. I don't know how they got created; perhaps the 2006-to-4.0 upgrade created those entries. I deleted them as well, and then Step 2 finished the upgrade.
When we started the upgrade, the database was 40 GB; when we finished, the database was 270 GB: 85 GB was the data and 190 GB was the log file. So make sure you have plenty of disk space to finish the upgrade process.
We decided to make the changes to the modified reports manually; like the pages, this would give the developer more experience with the 2013 report designer. Upgrading the custom 50000-range reports was one of the biggest surprises: it was almost too easy. You import the objects, compile them, and then go to Tools -> Upgrade Reports. If a report does not compile because of some reference or code, go back to the 4.0 database and comment out the code, then reimport, compile, and select Upgrade Reports again. Most of the reports so far have been upgraded without any additional changes, and they look identical to the Classic reports. A couple of reports, such as Checks, needed manual tweaking, but those reports required tweaking during every past upgrade as well.
The client is in the process of reviewing 2013 with their own data. They are planning to train their users, who are not that computer savvy, and there are a lot of things they have to learn. This way they will be ready when 2013 is released.

Posted in Dynamics NAV | Comments Off

Compressing Warehouse Entry in Dynamics NAV 2013

3rd July 2012

My last blog was about how to improve performance if you are using the NAV warehouse system. In that solution I used a SQL script that summarizes the Warehouse Entry table by item, variant, zone, bin, lot, serial number, and unit of measure and inserts the data into a new 50000-range table; a NAV processing report then deletes all the warehouse entries and inserts the summarized entries back into Warehouse Entry. In NAV 2013, MS has provided a new object type, Query, for C/SIDE developers. You can use this object for fast retrieval of data from SQL and let SQL Server do some of the processing and calculation. I've rebuilt the solution in NAV 2013 using a query and a codeunit. This makes the solution much easier to schedule nightly and keeps a warehouse-enabled database fast and optimized.

Here is a screenshot of the query object that summarizes warehouse entries.

Query Object

And here is the Codeunit.

Query CodeUnit

Attached are the objects
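For readers who cannot make out the screenshots, here is a rough C/AL sketch of what the two objects do. The object, variable, and query column names are my own placeholders, and the exact field handling is an assumption; the attached objects are the authoritative version.

```
// Sketch only. Assumes a query object "Whse. Entry Summary" that groups
// Warehouse Entry by item, variant, location, zone, bin, lot, serial and
// UOM, with Quantity and "Qty. (Base)" defined as Sum columns.
WhseSummary.OPEN;
WHILE WhseSummary.READ DO BEGIN
  // copy each summarized row into the 50K-range staging table
  WhseEntryCompressed.INIT;
  NextEntryNo += 1;
  WhseEntryCompressed."Entry No." := NextEntryNo;
  WhseEntryCompressed."Item No." := WhseSummary.Item_No;
  WhseEntryCompressed."Variant Code" := WhseSummary.Variant_Code;
  WhseEntryCompressed."Location Code" := WhseSummary.Location_Code;
  WhseEntryCompressed."Zone Code" := WhseSummary.Zone_Code;
  WhseEntryCompressed."Bin Code" := WhseSummary.Bin_Code;
  WhseEntryCompressed."Lot No." := WhseSummary.Lot_No;
  WhseEntryCompressed."Serial No." := WhseSummary.Serial_No;
  WhseEntryCompressed."Unit of Measure Code" := WhseSummary.UOM_Code;
  WhseEntryCompressed.Quantity := WhseSummary.Sum_Quantity;
  WhseEntryCompressed."Qty. (Base)" := WhseSummary.Sum_Qty_Base;
  WhseEntryCompressed.INSERT;
END;
WhseSummary.CLOSE;

// then replace the live entries with the compressed ones
WarehouseEntry.DELETEALL;
IF WhseEntryCompressed.FINDSET THEN
  REPEAT
    WarehouseEntry.TRANSFERFIELDS(WhseEntryCompressed);
    WarehouseEntry.INSERT;
  UNTIL WhseEntryCompressed.NEXT = 0;
```

The point of the Query object is that the grouping and summing runs on SQL Server instead of row-by-row in C/AL, which is what makes this fast enough to schedule nightly.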

Posted in Dynamics NAV | Comments Off

Compressing Warehouse Entry

15th May 2012

There have been many posts about compressing or deleting ledger and subledger entries in NAV. The table I would like to talk about is Warehouse Entry. If a location is set up with bins or zones, this table records every activity that has been performed for an item/bin/zone/unit of measure/lot/serial number. The table assigns consecutive numbers to all entries, and the entries appear in entry-number order. The activities include every removal and placement of items in the bins and every adjustment registered to the bins. Dynamics NAV creates warehouse entries whenever you post or register a warehouse document or journal. As you can already guess, this table can quickly grow very large as users register transactions. Standard NAV provides a method to compress warehouse entries, but it runs in C/AL and is a very slow and time-consuming process. I've built the following solution, which deletes all the warehouse entries and recreates them with a SQL script and a NAV processing report.
The process I followed is as follows.
In Object Designer, open the Warehouse Entry table in design mode and save it as a new table. I called it "Warehouse Entry Compressed" with table ID 50085. The point of copying the table is that any custom fields you have added will all exist in the new table.
Afterwards, run the SQL script below to populate the new table. If you have custom fields, you will need to include them in the script as well.
INSERT INTO [dbo].[Cronus$Warehouse Entry Compressed]
      ([Entry No_]
      ,[Journal Batch Name]
      ,[Line No_]
      ,[Registering Date]
      ,[Location Code]
      ,[Zone Code]
      ,[Bin Code]
      ,[Item No_]
      ,[Quantity]
      ,[Qty_ (Base)]
      ,[Source Type]
      ,[Source Subtype]
      ,[Source No_]
      ,[Source Line No_]
      ,[Source Subline No_]
      ,[Source Document]
      ,[Source Code]
      ,[Reason Code]
      ,[No_ Series]
      ,[Bin Type Code]
      ,[Journal Template Name]
      ,[Whse_ Document No_]
      ,[Whse_ Document Type]
      ,[Whse_ Document Line No_]
      ,[Entry Type]
      ,[Reference Document]
      ,[Reference No_]
      ,[User ID]
      ,[Variant Code]
      ,[Qty_ per Unit of Measure]
      ,[Unit of Measure Code]
      ,[Serial No_]
      ,[Lot No_]
      ,[Warranty Date]
      ,[Expiration Date]
      ,[Phys Invt Counting Period Code]
      ,[Phys Invt Counting Period Type])
SELECT WE2.[Entry No_]
      ,WE2.[Journal Batch Name]
      ,WE2.[Line No_]
      ,WE2.[Registering Date]
      ,Summary2.LocationCode
      ,WE2.[Zone Code]
      ,Summary2.BinCode
      ,Summary2.ItemNo
      ,Summary2.SQuantity
      ,Summary2.QtyBase
      ,WE2.[Source Type]
      ,WE2.[Source Subtype]
      ,WE2.[Source No_]
      ,WE2.[Source Line No_]
      ,WE2.[Source Subline No_]
      ,WE2.[Source Document]
      ,WE2.[Source Code]
      ,WE2.[Reason Code]
      ,WE2.[No_ Series]
      ,WE2.[Bin Type Code]
      ,WE2.[Journal Template Name]
      ,WE2.[Whse_ Document No_]
      ,WE2.[Whse_ Document Type]
      ,WE2.[Whse_ Document Line No_]
      ,WE2.[Entry Type]
      ,WE2.[Reference Document]
      ,WE2.[Reference No_]
      ,WE2.[User ID]
      ,WE2.[Variant Code]
      ,WE2.[Qty_ per Unit of Measure]
      ,Summary2.UOM
      ,Summary2.SerialNo
      ,Summary2.LotNo
      ,WE2.[Warranty Date]
      ,WE2.[Expiration Date]
      ,WE2.[Phys Invt Counting Period Code]
      ,WE2.[Phys Invt Counting Period Type]
FROM
  (SELECT * FROM
    (SELECT [Item No_] AS ItemNo, [Location Code] AS LocationCode,
            SUM([Quantity]) AS SQuantity, SUM([Qty_ (Base)]) AS QtyBase,
            [Bin Code] AS BinCode, [Lot No_] AS LotNo, [Serial No_] AS SerialNo,
            [Unit of Measure Code] AS UOM,
            -- keep the latest Entry No_ of each group, so the other fields
            -- can be pulled from that entry via the join below
            (SELECT TOP 1 [Entry No_]
             FROM [dbo].[Cronus$Warehouse Entry] AS WEntry
             WHERE WE.[Item No_] = WEntry.[Item No_] AND
                   WE.[Bin Code] = WEntry.[Bin Code] AND
                   WE.[Lot No_] = WEntry.[Lot No_] AND
                   WE.[Location Code] = WEntry.[Location Code] AND
                   WE.[Unit of Measure Code] = WEntry.[Unit of Measure Code] AND
                   WE.[Serial No_] = WEntry.[Serial No_]
             ORDER BY [Entry No_] DESC) AS EntryNo
     FROM [dbo].[Cronus$Warehouse Entry] AS WE
     GROUP BY [Item No_],[Location Code],[Bin Code],[Lot No_],[Serial No_],[Unit of Measure Code]
    ) AS Summary
   WHERE QtyBase <> 0) AS Summary2
LEFT JOIN [dbo].[Cronus$Warehouse Entry] AS WE2
  ON (Summary2.EntryNo = WE2.[Entry No_])

Then run report 50085 Compressed Warehouse Entry. This report deletes all the warehouse entries and recreates them from the compressed entries.
There are many advantages to cleaning up this table: it improves performance and concurrency and saves space.
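The report itself boils down to a delete-and-reinsert. A minimal sketch of its logic, assuming a field-for-field copy via TRANSFERFIELDS (take a backup first; the linked objects are the authoritative version):

```
// OnPreReport of report 50085 "Compressed Warehouse Entry" -- sketch only
WarehouseEntry.DELETEALL;

WhseEntryCompressed.RESET;
IF WhseEntryCompressed.FINDSET THEN
  REPEAT
    WarehouseEntry.INIT;
    WarehouseEntry.TRANSFERFIELDS(WhseEntryCompressed);
    WarehouseEntry.INSERT;
  UNTIL WhseEntryCompressed.NEXT = 0;
```

Because the compressed table was saved from the Warehouse Entry table design, TRANSFERFIELDS copies every field, including any custom ones, without listing them individually.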

Here is the link to the text objects and the SQL script.

Posted in Dynamics NAV | Comments Off

NAV 2013 Beta Released

14th May 2012

Microsoft today released the beta of NAV 2013 on PartnerSource. Here is the URL. It has been released for the first 14 countries.
I suggest downloading it and starting to study it.

Posted in Dynamics NAV | Comments Off

Role Tailored Client Localization

24th October 2011

Recently I was asked to help localize Dynamics NAV for Korea. I was involved a couple of years ago with another project we implemented in Korea, so I knew the process for localizing the NAV Classic client. The project was on version 4.0 SP3. Microsoft has not localized NAV for any country in Asia except India; on PartnerSource you can find the downloads for all the localized countries/regions. Instead, local partners in those countries localize it themselves. The process involves creating captions for all the fields, errors, messages, and reports. To localize the executables, you have to translate fin.stx. In addition, Korean uses double-byte character sets (DBCS), which you need to enable in fin.stx. The following link provides the steps to enable DBCS on your computer.

Once you are done with the translation, you need to send fin.stx to Microsoft to "close" the file, which basically adds a checksum to it.
This is how classic client looks in Korean.
NAV Korean

With the Role Tailored client for NAV 2009 R2, fin.stx is no longer used. Instead you need to download a template project provided by Microsoft. Here is the link.

Once you open the project, you will need to rename resources.en-CA.resx to your language code. On MSDN you will find a list of all the culture codes. The download link above contains a Word document with links inside it. Once you have renamed all the files, you will need to go through each project and sign the assembly. These steps are explained in detail in the download above. This is a screenshot of the solution in Visual Studio.

Visual Studio NAV Korean

You will need to translate all the resx files and compile/build the project. Visual Studio will create all the necessary DLL files in the debug folder. If you see an en-CA folder in the output folder when compiling the project/solution, it means you have missed renaming some of the resx files. Since I don't know Korean, I left the translations as is; a native speaker will do a much better job translating them.
Once the DLLs were created, you will need to create a folder for the language (ko-KR) under the RTC folder and put the DLL files there, as well as all the help files. Under the service tier you will need two folders (KOR and ko-KR): KOR holds the help files, and ko-KR contains the DLL files.

The Role Tailored client only loads DLL files that are signed, and in order to use them you will need to run the Client Add-in page and register all the DLL files with their public key tokens. This is how it should look.

NAV addin table

Once you have done all the steps, restart the service tier and start the RTC, and you should see your language in the Select Language option. If you don't see it, you've made a mistake somewhere in the process. Here is a screenshot of the RTC in Korean.

RTC NAV Korean

You'll notice that a lot of the UI is still in English; the reason is that I have not translated the resx files. The existing caption translations are still present, though, so finishing the translation will require much less effort.
The new method of translating the RTC makes it very easy for partners to localize the RTC for countries where MS does not release NAV, and it no longer requires MS involvement with fin.stx. Hopefully a future RTC will be Unicode compliant, and you won't need to do anything special with DBCS or uncheck the Validate Collation setting.

Posted in Dynamics NAV, Role Tailored Client | Comments Off

Integrating with MSMQ using dotNET data types in NAV 2009 R2

2nd April 2011

With the release of NAV 2009 R2, we can now use DotNet data types, which give C/AL programmers access to the .NET Framework from within C/AL. There are many benefits to using DotNet data types instead of Automation data types. First is distribution of your C/AL code: you simply send the fob to the client, and if you are referencing only .NET Framework classes, they are installed by default on any box with the NAV RTC. Second is performance: DotNet data types perform better than COM Automation. Third, the code is visible and editable in C/AL.

Over the past several years I've integrated NAV with many other systems using Microsoft Message Queuing (MSMQ): BizTalk, websites, third-party manufacturing systems, and handheld units, for example. I used the Automation provided by MS, "Navision MS-Message Queue Bus Adapter". It had many limitations, and for certain integrations I had to use the "Microsoft Message Queue 3.0 Object Library" instead. I've decided to rewrite the solution using DotNet.

The solution below sends and receives MSMQ messages using DotNet variables. It consists of two codeunits: codeunit 50010 "Send MSMQ" sends an XML file, and codeunit 50011 "Receive MSMQ" reads the XML messages and stores them on the C: drive. You can of course change this based on your requirements.

Here is a screenshot of Send MSMQ

Here is a screenshot of Receive MSMQ
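In case the screenshots are hard to read, the send side looks roughly like this. The queue path, file name, and variable names are my placeholders; Q, Msg, QActiveXFormatter, and XmlDoc are DotNet variables of subtypes System.Messaging.MessageQueue, System.Messaging.Message, System.Messaging.ActiveXMessageFormatter, and System.Xml.XmlDocument.

```
// Sketch only -- sends an XML document to a private queue
XmlDoc := XmlDoc.XmlDocument;
XmlDoc.Load('C:\temp\outbound.xml');          // placeholder file name

Q := Q.MessageQueue('.\private$\navqueue');   // placeholder queue path
Q.Formatter := QActiveXFormatter.ActiveXMessageFormatter;

Msg := Msg.Message;
Msg.Body := XmlDoc.OuterXml;                  // the whole document as text
Q.Send(Msg);
Q.Close;
```

The receive side is the mirror image: set the same formatter on the queue, call Receive, and save the message body to disk.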

The code is simple, but it took me a while to get it to work, mainly because of the MessageFormatter. In dotNET you need to specify the formatter for your messages. By default MSMQ uses the XmlMessageFormatter, but I could not make it work in C/AL.
The C# code looks like this.
q.Formatter = new XmlMessageFormatter(new Type[] { typeof(string) });

So instead I am using
Q.Formatter := QActiveXFormatter.ActiveXMessageFormatter;

The ActiveXMessageFormatter is used for backward compatibility. You could also use the binary message format; your code would look like this.

Q.Formatter := QBinaryFormatter.BinaryMessageFormatter;

The reason is that there is no equivalent of typeof in C/AL. If you find a workaround, post a response. If you have any solution using Automation, it is worth taking a look at reimplementing it in DotNet. If you are still using the Classic client, you can run this code via a web service on the service tier. Here is the link to the objects in fob and text format: MSMQ.zip.

Posted in DotNet, Dynamics NAV | No Comments »

The Query returned no rows for the data set

11th February 2011

If you are using the RTC and previewing certain reports, you will probably run into this error: "An error occurred during local report processing. The Hidden expression for the textbox '……' contains an error: The query returned no rows for the data set. The expression therefore evaluates to null."

RDLC print preview

I searched around, didn't find any info on this, and started researching by trial and error. It turns out this is simply not an error from NAV's perspective; it just means there is no data to print. The way RTC reports work is that the report data is generated as an XML file on the service tier and sent to the client, which uses it as the data source to render the RDLC.
In this case the service tier returned an empty XML file. So why would the service tier return an empty XML file? After looking at different reports, I've concluded that if the PrintOnlyIfDetail property is set on a data item, the service tier removes parent records that have no children.
I suggest telling your customers/end users that there is no data for the chosen filters. If they are not OK with that, you have to add a dummy Integer data item at the end of the report and have it print one blank line when there are no parent data items to print.
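One way to sketch that dummy data item: append an Integer data item (DataItemTableView = SORTING(Number) WHERE(Number=CONST(1))) after the parent data item, flag whether the parent produced output, and suppress the blank line when it did. The variable and data item names here are placeholders.

```
// Parent data item, OnAfterGetRecord trigger:
ParentFound := TRUE;

// "BlankLine" Integer data item, OnPreDataItem trigger:
IF ParentFound THEN
  CurrReport.BREAK;   // the parent printed something, skip the blank line
```

Since the Integer data item always yields its one row, the dataset is never empty and the Hidden expressions evaluate cleanly.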

Posted in Dynamics NAV | No Comments »

Using ADO on RTC in NAV

10th January 2011

On one of my recent projects I had to build a solution to update a NAV table with quantity on hand by location. Writing a processing report to update and maintain the table would have taken less than an hour, but for this project the performance of a NAV report would not have worked: the client has about 1,000 locations and about 80,000 items, so in the worst case you could end up with 80 million records in this table. The solution I provided uses ADO to connect to SQL Server and issue a SQL statement against an Item Ledger Entry indexed view.
My solution was based on Waldo’s post

This worked fine in the Classic client, but when I tried to run it on the Role Tailored Client (RTC), I got the following error.

The expression Variant cannot be type-converted to a String value.

The error was from the following line.
lADOCommand.ActiveConnection := lvarActiveConnection;

This basically means that you cannot use the 'Microsoft ActiveX Data Objects 2.8 Library' in the RTC. So what other options are available? You could build an external DLL that wraps the ActiveX DLL, but that creates many headaches with registering and maintaining the DLL on every workstation. The other solution, which I'm going to talk about here, is the new DotNet data type Microsoft released in NAV 2009 R2. With this data type, you can access the .NET Framework and reference its classes in NAV. The code for accessing the NAV database using the DotNet data type looks like this.

ServerName := 'VIRTUALXP-51168';
NavDb := 'Demo Database NAV 6R2';

ConnectionString := 'Data Source='+ServerName+';'
+ 'Initial Catalog='+NavDb+';'
+ 'Trusted_Connection=True;';

SQLConnection := SQLConnection.SqlConnection(ConnectionString);

SQLCommand := SQLConnection.CreateCommand();
SQLCommand.CommandText := 'select * From Session';
SQLReader := SQLCommand.ExecuteReader;
SQLReader.Read; // position the reader on the first row before calling Get*

MESSAGE('Reading %1 , %2',SQLReader.GetInt32(0), SQLReader.GetString(1));



In the example above I'm simply reading the Session table and printing the "Connection ID" and "User ID". In addition, you can assign SQL statements longer than 1024 characters to SQLCommand.CommandText.
Here are the DotNet data types for the above code.

Name DataType Subtype
SQLConnection DotNet 'System.Data, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Data.SqlClient.SqlConnection
SQLCommand DotNet 'System.Data, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Data.SqlClient.SqlCommand
SQLReader DotNet 'System.Data, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Data.SqlClient.SqlDataReader

If you are on an older version of NAV, you have to do an executables upgrade to Dynamics NAV 2009 R2 to be able to use DotNet data types.
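One thing to note about SqlDataReader: it starts positioned before the first row, so Read must be called before the Get* methods, and both the reader and the connection should be closed when you are done. A sketch of walking all the rows, using the same three variables as above:

```
// loop over every row returned by the SELECT
WHILE SQLReader.Read DO
  MESSAGE('Connection ID %1, User ID %2',
    SQLReader.GetInt32(0), SQLReader.GetString(1));

SQLReader.Close;
SQLConnection.Close;
```

Closing the connection matters on the service tier, where leaked SQL connections accumulate across sessions.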

Posted in DotNet, Dynamics NAV | 1 Comment »

String Implementation in NAV

25th November 2010

If you have developed with any OCX or COM objects in Dynamics NAV, you have probably run into the following error: "The length of the text string exceeds the size of the string buffer." You get this error when a COM object passes a string longer than NAV's limit: NAV caps COM object strings at 1024 characters (in older versions it was 250, but they increased it to 1024). To work around this limitation, you had to build a COM wrapper around the DLL that splits the string into chunks of 1024 characters or less and passes them to NAV. This solution was cumbersome to roll out and maintain on every PC, and you couldn't compile the objects without the wrapper. I had suggested that MS remove this limit, and they are working on it. Freddy, who was in one of those meetings with MS, suggested that you could use the Variant type to get around the problem. I didn't try to look for a solution until recently, when I received an email from a client running into this problem. Here is the solution. It only works on the Role Tailored client; on Classic you will still receive the error. If you are running Classic, you could use a NAV web service and let the service tier run the code.
In the code below, xmlDom.xml returns the whole XML document as a string of more than 1024 characters.


MyVariant := xmlDom.xml;

Name DataType
MyVariant Variant
xmlDom Automation 'Microsoft XML, v6.0'.DOMDocument
mytext Text 1024
BText BigText

NAV 2009 R2, which is being released on December 15th 2010, will allow NAV developers to use the .NET Framework through the new DotNet object type. The solution works with DotNet objects as well. Here is an example.

xmlDom := xmlDom.XmlDocument;
MyVariant := xmlDom.InnerXml;


The above code is very similar to the COM example, except xmlDom is a DotNet variable with subtype System.Xml.XmlDocument:
'System.Xml, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'.System.Xml.XmlDocument

Posted in Dynamics NAV | 2 Comments »

Downgrade 2009 sp1 objects into 5.x or older databases

9th April 2010

I try to reuse most of the modifications I make in other projects. I have built a set of generic integration objects that can be used in 80% of the scenarios where NAV integrates with other systems. I built it in 2009 SP1, and one of our clients is running 5.0. As you probably know, loading a fob from 2009 into an older version is not possible; it crashes older NAV clients. The workaround is to export the objects from 2009 as text, remove the 2009-specific parts from the text file, and then import it into 5.0 or older executables. I'm sure many partners and VARs are in the same situation: they do their development on the latest version of NAV and back-port it to older versions when required.
I've built a tool that takes a text file, automatically removes the 2009 SP1-specific properties, and makes the file importable into 5.0 or older versions.
Downgrade NAV objects

I have tested this with all NAV standard objects by loading them from 2009 into 5.0 SP1.
Remember, you do not need the tool for forms, dataports, and codeunits; it is needed for tables, XMLports, and reports.

Here is the link.

Posted in Dynamics NAV | 10 Comments »