Rashed Amini

The ara3n weblog

AP Aging Report in SQL For Dynamics NAV

1st July 2014

I had previously posted an AR Aging report that customers can use to run, directly on SQL, aging reports that take a long time to run in NAV. Here is the matching AP Aging report in SQL.

Here is the file Link

Here is a screenshot of running the query in SQL Server Management Studio (SSMS)


Posted in Dynamics NAV | No Comments »

AR Aging Report in SQL For Dynamics NAV

1st July 2014

I had posted a while back the Inventory Valuation SQL query that can be used to run that report on SQL Server. SQL reports run much faster than NAV reports because SQL Server does all the data crunching and aggregation. Below is the SQL statement for the AR Aging report.
As you'll notice, the query returns 8 columns instead of the 4 columns that standard NAV returns.
The columns are based on the Aging Days parameter. You can add the parameters to a SQL report and publish the query.
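For illustration, the aging buckets themselves are plain T-SQL. The sketch below is not the attached query, just the underlying idea; the company prefix, column names, and fixed 30-day buckets are assumptions based on a Cronus demo database:

DECLARE @AsOfDate date = '2014-06-30';

SELECT cle.[Customer No_],
       SUM(CASE WHEN DATEDIFF(day, cle.[Due Date], @AsOfDate) < 1
                THEN dcle.[Amount (LCY)] ELSE 0 END) AS [Current],
       SUM(CASE WHEN DATEDIFF(day, cle.[Due Date], @AsOfDate) BETWEEN 1 AND 30
                THEN dcle.[Amount (LCY)] ELSE 0 END) AS [1-30 Days],
       SUM(CASE WHEN DATEDIFF(day, cle.[Due Date], @AsOfDate) BETWEEN 31 AND 60
                THEN dcle.[Amount (LCY)] ELSE 0 END) AS [31-60 Days],
       SUM(CASE WHEN DATEDIFF(day, cle.[Due Date], @AsOfDate) > 60
                THEN dcle.[Amount (LCY)] ELSE 0 END) AS [Over 60 Days]
FROM [CRONUS$Cust_ Ledger Entry] cle
JOIN [CRONUS$Detailed Cust_ Ledg_ Entry] dcle
  ON dcle.[Cust_ Ledger Entry No_] = cle.[Entry No_]
WHERE dcle.[Posting Date] <= @AsOfDate
GROUP BY cle.[Customer No_];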

Here is the file Link

Here is a screenshot of running the query in SQL Server Management Studio (SSMS)


Posted in Dynamics NAV | No Comments »

Storing and editing Large Text in NAV 2013

28th February 2014

As you probably know, NAV has a limit on the length of a text field. It is set at 250 characters. There are scenarios where the client needs to store data that is longer than 250 characters. If you look at any website that sells items, you'll see a detailed description made up of many paragraphs. In older versions of NAV, partners used the following solutions. They stored the data in a BLOB field and opened it for editing in an external application, e.g. Notepad, or an automation, e.g. WaldoPad. The problem with this approach is that you need to install the automation on each machine and support it.
The other solution was to store the data in text fields but split it across multiple records, for example in the Comment Line table or the Extended Text Line table. The problem with extended text or comment lines is that editing the data becomes harder. If you change a line in the middle of a paragraph, then all the lines below need to be moved as well.
The third solution is a mix of the two above, where you do the editing in an external application or automation and then split the data and store it in a table.
NAV 2013 still has the text field size limit, but there is no limit on the Text data type. You can create a text variable, add as much data into it as you want, and set the source expression of a text box to the text variable, and the user can edit it like a paragraph. At the end you store the data either in a BLOB field or split the string across records in a table. The advantage of this solution is that you are not using any external application or COM component; you are doing it all within NAV.
Below is a demo solution that you can use in your projects and as a replacement for an existing solution if you are upgrading.
In this demo I'm going to use the first record in the Item table and store the data either in the Comment Line table or in the Picture field, which is a BLOB field. You can change this based on your requirements and use one or the other solution to store the data.
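The core of the BLOB variant is only a few lines of C/AL. Here is a minimal sketch, where the variable names are assumptions: LargeText is a Text variable with no length set, InStr is an InStream, and OutStr is an OutStream:

// Save the text variable into the BLOB field
Item.FINDFIRST;
Item.Picture.CREATEOUTSTREAM(OutStr);
OutStr.WRITETEXT(LargeText);
Item.MODIFY;

// Load it back
Item.CALCFIELDS(Picture);
IF Item.Picture.HASVALUE THEN BEGIN
  Item.Picture.CREATEINSTREAM(InStr);
  InStr.READTEXT(LargeText); // READTEXT stops at a line break; multi-line text needs a read loop
END;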

nav notepad

Here is the link to the object. It is a 2013 R2 object but can be implemented in 2013 as well.

Posted in Dynamics NAV | No Comments »

JobQueue in NAV 2013

26th February 2014

I had posted a while back a NAS solution that used two NAS instances to run jobs for an unlimited number of companies. This solution worked great and saved money on buying NAS licenses for each company. In NAV 2013, NAS runs on the service tier, so the old solution will not work. I had also written a second solution that used NAS and web services. It used the MS XMLDOM COM automation object; it could be used in NAV 2013 R2, but the data type would need to be changed to a DotNet data type, since automations are no longer supported on the service tier. In NAV 2013 you no longer need these solutions; the functionality comes by default with NAS. When you enable NAS on a NAV 2013 service tier, it loops through all the companies and looks at the Job Queue Entry table. You need to create a Job Queue record in each company and make sure "Start automatically from NAS" is checked.

Job Queue Image

NAS will look at each company for Job Queues and start a background session. These background sessions run separately in each company and process any jobs that are set up in that company, so there is no need to install multiple NAS service tiers. The NAS service is just used to start these background sessions, monitor them, and make sure they are running; if one of them stops, the NAS service will start another background session. So if you have hundreds of companies, you will have a background session running in each company based on the Job Queue record created in that company. These background sessions process jobs concurrently, which is a much more scalable solution. And the best part is that you no longer need licenses for these background sessions.
So make sure you are running only one NAS Windows service instance for the Job Queue. If you need multiple sessions, create multiple Job Queue records with different category filters.
People might also get confused by the table names: Job Queue = background session; Job Queue Entry = the codeunit/report that performs a certain task, such as running Adjust Cost. The startup codeunit for NAS is 450.

Job Queue Image

Posted in Dynamics NAV 2013 | No Comments »

Automatic Lock Objects in NAV 2013 and 2013 R2

20th November 2013

Several years ago I had written a blog about a checkout tool written in SQL CLR. The solution was based on a SQL table trigger on the Object table and would write the data into a Checkout table. If another user tried to modify the same object, the user would get an error that somebody else had modified the object. In NAV 2009 Microsoft introduced checkout functionality. They added new fields to the Object table called Locked and Locked By, and under Tools->Options Microsoft added the property "Auto-Lock on Design". This option isn't useful, since a lot of the time a developer just designs objects for research purposes and doesn't want to lock them. A lot of NAV development involves finding where the code is and how it works. The original tool I had written still works in NAV 2013. The issue is that in order to check in an object you have to run the table and delete the record, and as you know, in NAV 2013 data manipulation is done from the RTC only. I've built a new solution that uses the new fields, Locked and Locked By, in the Object table. The solution does not use any new objects; it's just a SQL trigger on the Object Tracking table. When an object is modified in NAV, the development client writes to or updates the Object Tracking table. The SQL trigger then updates the Locked and Locked By fields in the Object table.
Here is the script.

CREATE TRIGGER [AutoLock] ON [dbo].[Object Tracking]
AFTER INSERT, UPDATE
AS
BEGIN
  DECLARE @Type int
  DECLARE @ObjectID int
  DECLARE @ChangeType int
  DECLARE @LockedBy varchar(132)

  SET @Type = (SELECT [Object Type] FROM inserted)
  SET @ObjectID = (SELECT [Object ID] FROM inserted)
  SET @ChangeType = (SELECT [Change Type] FROM inserted)

  SET @LockedBy = (SELECT [Locked By] FROM [Object] WHERE [Type] = @Type AND [ID] = @ObjectID)

  -- Only lock when a real object was modified (Change Type 0) and nobody holds the lock yet
  IF (@Type > 0) AND (@ChangeType = 0) AND (@LockedBy = '')
  BEGIN
    UPDATE [Object]
    SET [Locked By] = SYSTEM_USER,
        [Locked] = 1
    WHERE [Type] = @Type AND [ID] = @ObjectID
  END
END

This trigger only fires when the developer modifies or inserts an object. This way you can design an object without worrying that it will lock the object. To release the lock, the developer simply selects one or more objects and chooses Functions->Release Locks. If you are deleting objects, make sure you lock them first. This solution works with 2009 R2 as well. Enjoy.

Posted in Dynamics NAV, Dynamics NAV 2013 | Comments Off

Using Enumerators in C/AL to iterate through files in a folder

26th March 2013

Enumerators in .NET allow you to iterate through the arrays and collections in a class. You can see this used, for example, in a FOREACH loop. Here is an example.

DirectoryInfo di = new DirectoryInfo("c:\\temp");
FileInfo[] files = di.GetFiles("*.pdf");
foreach (FileInfo fileInfo in files)
    Console.WriteLine(fileInfo.Name);

There is no way to translate a FOREACH loop directly into C/AL, but there is a workaround using the List class.
In the above example, DirectoryInfo returns an array of FileInfo, and in the FOREACH loop we go through each file and print its name to the console.

So the first line in NAV initializes DirectoryInfo with its constructor:

DirectoryInfo := DirectoryInfo.DirectoryInfo('C:\temp\');

The second line is where we assign the result to a List. The C/AL compiler should error out here, but it does not; it looks like it's not doing any type checking, and the object type returned is determined at runtime. In C# this would be DirectoryInfo.GetFiles().ToList();

List := DirectoryInfo.GetFiles('*.txt');
enumerator := List.GetEnumerator();

Once we have the List's enumerator, we can loop through each object:

WHILE enumerator.MoveNext DO BEGIN
  FileInfo := enumerator.Current;
END;


Here is the whole code with the variable types:

DirectoryInfo DotNet System.IO.DirectoryInfo.'mscorlib, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'
FileInfo DotNet System.IO.FileInfo.'mscorlib, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'
List DotNet System.Collections.Generic.List`1.'mscorlib, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'
enumerator DotNet System.Collections.IEnumerator.'mscorlib, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089'

DirectoryInfo := DirectoryInfo.DirectoryInfo('C:\temp\');

List := DirectoryInfo.GetFiles('*.txt');

enumerator := List.GetEnumerator();

WHILE enumerator.MoveNext DO BEGIN
  FileInfo := enumerator.Current;
  MESSAGE(FileInfo.Name); // e.g. show each file name
END;

Posted in DotNet, Dynamics NAV 2013 | Comments Off

NAV 2013 Chart Add-in

5th March 2013

NAV 2013 comes with a new Chart add-in. In NAV 2009 we had basic charts that were XML files that could be displayed on a page. In this blog I'll show the steps required to build a basic chart add-in.
1. Create a new Page of type CardPart. The source table needs to be the Business Chart Buffer table.
2. Click Finish. You should see the Page in design mode with just a ContentArea line.
3. Add a new field line below ContentArea. Enter MyChart in the Name field. The SourceExpr should be blank. Go to the properties of the line and set the ControlAddIn property to Microsoft.Dynamics.Nav.Client.BusinessChart. Your Page should look like the picture below.

4. For this chart I will show sales orders by location as an example.
5. Then go to the code and add the following OnFindRecord trigger.
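The trigger itself appears in the original post as a screenshot. Below is a minimal sketch of what such an OnFindRecord trigger can look like; the variables Location (Record, Location), SalesHeader (Record, Sales Header), and I (Integer) are assumptions for illustration. Because the page's source table is the Business Chart Buffer, the buffer functions can be called directly:

Initialize;
SetXAxis('Location',"Data Type"::String);
AddMeasure('Sales Orders',0,"Data Type"::Decimal,"Chart Type"::Column);
I := 0;
IF Location.FINDSET THEN REPEAT
  // Count open sales orders per location and add one column per location
  SalesHeader.SETRANGE("Document Type",SalesHeader."Document Type"::Order);
  SalesHeader.SETRANGE("Location Code",Location.Code);
  AddColumn(Location.Code);
  SetValue('Sales Orders',I,SalesHeader.COUNT);
  I += 1;
UNTIL Location.NEXT = 0;
Update(CurrPage.MyChart);
EXIT(TRUE);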

And that's it. If you run the Page directly from the Object Designer you will get an error:
"The control add-in on control MyChart on page MyChart Example has not been instantiated."
You will need to add it as a part to view it, or run it from another page as an action:
PAGE.RUNMODAL(PAGE::"MyChart Example");
The Page looks like this in the Cronus company with demo data.

The above example is a simple chart, but you can build many other kinds of charts. You have to consider performance when writing your chart. If you are querying historical tables with a lot of data, I suggest creating a table to store the calculated data, then on opening the page checking whether there are any changes and updating only those. You could also schedule the updates on NAS. There are additional settings that you can set; for further research, take a look at the "Trailing Sales Orders Chart" Page.
Building charts based on this add-in is quite powerful and simple, and they can be developed rapidly.
For example, just changing the code a little:

AddMeasure('Sales' ,0,"Data Type"::Decimal,"Chart Type"::Pie);
Location.SETRANGE("Use As In-Transit",FALSE);

SalesHeader.SETRANGE("Location Code",Location.Code);
I += 1;

UNTIL Location.NEXT = 0;

Gives you a nice Pie Chart.

Attached is the object link.

Posted in Dynamics NAV | Comments Off

How to remove Namespaces in XMLport in NAV 2013

7th December 2012

In Dynamics NAV 5.0, MS created the XMLport data type. It was a great tool for loading XML files, but it had trouble dealing with namespaces. MS released workaround code that allowed removing all the namespaces from the XML file. The code looked like this.


In Dynamics NAV 2013, 32-bit COM objects are no longer supported on the service tier, and the suggestion from MS is to move to DotNet data types for automation. I've used the above solution in many of my projects, and I'm sure many partners have used it in their existing solutions. When you upgrade, you will run into this code, and it will need to be upgraded to the System.Xml DotNet data type. The problem partners will have is implementing the transformNodeToObject function. I've provided a solution below that lets you do the same process with System.Xml.
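The attached codeunit contains the full solution. Its core is loading a namespace-stripping XSLT stylesheet into System.Xml.Xsl.XslCompiledTransform and running the transformation. Here is a minimal sketch, where the file paths and stylesheet name are assumptions for illustration, and XslTransform is a DotNet variable of type System.Xml.Xsl.XslCompiledTransform:

XslTransform := XslTransform.XslCompiledTransform;
// Load the XSLT that copies each element by local-name(), dropping its namespace
XslTransform.Load('C:\temp\RemoveNamespaces.xslt');
// Transform the input file and write the namespace-free result
XslTransform.Transform('C:\temp\Input.xml','C:\temp\Output.xml');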


So before the transformation, your XML files will look like this.


And after transformation like this.


Attached is the CU object

Posted in Dynamics NAV | Comments Off

Upgrade from NAV 4.0 sp3 to NAV 2013 Beta

12th July 2012

There are many clients that are still on an old version of Dynamics NAV and have not upgraded to 2009. Many of these clients decided not to invest in upgrading to a newer version because they didn't see any value, and I would agree with many of them. 2009 required more development hours and testing than previous upgrades. Many features were missing from the RTC, so some users were using the Classic client while others were using the RTC. Customizing reports was cumbersome. Modified forms and pages would go out of sync. There were certain scenarios where just an executables upgrade to 2009 was sufficient to use the web services features.
I would like to share the experience of upgrading an existing Dynamics NAV 4.0 installation to Dynamics NAV 2013 beta. The client was on 4.0, support for 4.0 had expired a while back, and the client was interested in moving to a supported version. Moving to 2009 didn't make sense since the 2013 release was right around the corner. The client wanted to see the application with their data and go through their workflows in 2013, so that when the final build is released they will be ready to upgrade to it. The client is also using the C/SIDE database. The work was assigned to a junior developer with a couple of months of experience, under my supervision. This client had about 350 modified objects: 82 tables, 109 forms, 96 reports, and 30 codeunits. This is a medium-sized number of objects.
I suggested merging the tables first, then the codeunits, then the forms, and last the reports. Based on his experience with this merge, he thought merging tables was easy and went quickly using text compare tools. Merging the codeunits was a little more difficult; some of them had changed dramatically, so it was harder to compare them as text. This is mainly because he was comparing modified 4.0 to 2013 txt files. He had missed a couple of functions, but I helped him resolve those. To upgrade the forms, I suggested that he redo the changes manually in pages instead of using the form transformation tool. This allowed him to get more experience designing and working with pages and to become more familiar with the application. His comment was that pages were his favorite objects to merge. He had a text compare tool compare 4.0 modified against 4.0 standard and made the changes manually in pages.
When he started upgrading the reports, we came to the conclusion that in order to test them and make sure they worked correctly, we needed actual data. So we decided to upgrade the data first, then upgrade the reports and run them to confirm that they produce the same results as Classic. There is no direct upgrade path from 4.0 to 2013, so we went the route of upgrading to 2009 as an intermediate step. You need a 2009 database with all the custom fields; you can create one by copying and pasting all the fields in the 50K range from the virtual table into a Cronus database. You cannot paste option fields, so those you have to add directly in each table.
We imported the 2009 Step 1 upgrade codeunit, ran it, loaded the 2009 Cronus database with custom fields, and started running into issues related to NA Payroll. Payroll was part of the North American localization in 4.0, and MS removed it in 5.0; there were a couple of fields on the Employee table that the customer was using. We had to create custom fields, move the data out of those HR fields, and blank them, since they were going to be deleted in 2009. Step 2 of 2009 ran without any issues, and Step 1 of 2013 was loaded into the database and ran without issues. Then we opened the DB with the 2013 development environment client, and this upgraded the DB, moving all the text fields to Unicode. Next we loaded Step 2 of 2013; this part you have to run from the RTC. The process creates the new Dimension Set IDs. Here we ran into another issue: the customer had deleted a dimension value record instead of blocking it, and we were getting a "Dimension Set ID value cannot be blank" error. I recreated the value, and we ran into a second issue: there were ledger entry dimension records with blank dimension values. I don't know how they got created; perhaps the 2006 upgrade to 4.0 created those entries. I deleted them as well, and then Step 2 finished the upgrade.
When we started the upgrade, the database was 40 GB; when we finished, the DB was 270 GB: 85 GB was data and 190 GB was the log file. So make sure you have plenty of space to finish the upgrade process.
We decided to make the changes in the modified reports manually. This would give the developer more experience with the 2013 report designer, just as with the pages. Upgrading the custom 50000-range reports was one of the biggest surprises: it was almost too easy. You import the objects, compile them, and then go to Tools->Upgrade Reports. If a report does not compile because of some reference or code, go back to the 4.0 DB and comment out the code; then reimport, compile, and select Upgrade Reports. Most of the reports so far have been upgraded without any additional changes, and they look identical to the Classic reports. There were a couple of reports, such as checks, that needed manual tweaking, but those reports required tweaking during every past upgrade as well.
The client is in the process of looking at 2013 with their data. They are planning to train their users, who are not that computer savvy, and there are a lot of things they have to learn. This way they will be ready when 2013 is released.

Posted in Dynamics NAV | Comments Off

Compressing Warehouse Entry in Dynamics NAV 2013

3rd July 2012

My last blog was about how to improve performance if you are using the NAV warehouse system. In my solution I used a SQL script that summarizes the Warehouse Entry table by item, variant, zone, bin, lot, serial, and UOM and inserts the data into a new 50K-range table; a NAV processing report then deletes all the warehouse entries and inserts the summarized entries back into Warehouse Entry. In NAV 2013, MS has provided a new object type, Query, for C/SIDE developers. You can use this object to do fast retrieval of data from SQL and let SQL Server do some of the processing and calculation. I've rebuilt the solution in NAV 2013 using a Query and a codeunit. This makes the solution much easier to schedule nightly, keeping the warehouse database optimized and fast.
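Conceptually, the summarization in both the old SQL script and the new Query object boils down to a GROUP BY over the Warehouse Entry table. A rough T-SQL sketch of the aggregation, where the company prefix and column names are assumptions based on a Cronus demo database:

SELECT [Item No_], [Variant Code], [Zone Code], [Bin Code],
       [Lot No_], [Serial No_], [Unit of Measure Code],
       SUM([Quantity]) AS [Quantity], SUM([Qty_ (Base)]) AS [Qty (Base)]
FROM [CRONUS$Warehouse Entry]
GROUP BY [Item No_], [Variant Code], [Zone Code], [Bin Code],
         [Lot No_], [Serial No_], [Unit of Measure Code]
HAVING SUM([Quantity]) <> 0;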

Here is a screenshot of the query object for summarize warehouse entry.

Query Object

And here is the Codeunit.

Query CodeUnit

Attached are the objects

Posted in Dynamics NAV | Comments Off