Rashed Amini

The ara3n weblog

Storing and editing Large Text in NAV 2013

28th February 2014

As you probably know, NAV has a limit on the size of text fields: 250 characters. There are scenarios where the client needs to store data longer than 250 characters. If you look at any website that sells items, you'll see detailed descriptions made up of many paragraphs. In older versions of NAV, partners used the following solutions. The first was to store the data in a BLOB field and open it for editing in an external application, e.g. Notepad, or an automation, e.g. WaldoPad. The problem with this approach is that you need to install the automation on each machine and support it.
The second solution was to store the data in text fields, split across multiple records, for example in the Comment Line table or the Extended Text Line table. The problem with extended text or comment lines is that editing the data becomes harder: if you change a line in the middle of a paragraph, all the lines below need to be moved as well.
The third solution is a mix of the two above, where you do the editing in an external application or automation and then split the data and store it in a table.
In NAV 2013 the text field size limit still exists, but there is no limit on the Text data type. You can create a text variable, put as much data into it as you want, and set the source expression of a text box to the text variable, so the user can edit it like a paragraph. At the end you can store the data either in a BLOB field or split the string across records in a table. The advantage of this solution is that you are not using any external application or COM component; you are doing it all within NAV.
Below is a demo solution that you can use in your projects, or to replace an existing solution if you are upgrading.
In this demo I'm going to use the first record in the Item table and store the data either in the Comment Line table or in the Picture field, which is a BLOB field. You can change it based on your requirements and use one or the other solution to store the data.
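As a rough illustration of the BLOB variant, here is a minimal C/AL sketch, assuming a page text box whose SourceExpr is the LargeText variable; the variable names are mine, and the demo objects linked below may differ:

// Variables: Item : Record Item; InStr : InStream; OutStr : OutStream;
//            LargeText : Text; Line : Text;

// Load the BLOB into the unlimited text variable, e.g. in OnOpenPage.
Item.FINDFIRST;
Item.CALCFIELDS(Picture);
IF Item.Picture.HASVALUE THEN BEGIN
  Item.Picture.CREATEINSTREAM(InStr);
  WHILE NOT InStr.EOS DO BEGIN
    InStr.READTEXT(Line);  // reads one line; re-add line breaks here if you need to keep them
    LargeText := LargeText + Line;
  END;
END;

// The user edits LargeText through the text box; afterwards save it back.
CLEAR(Item.Picture);
Item.Picture.CREATEOUTSTREAM(OutStr);
OutStr.WRITETEXT(LargeText);
Item.MODIFY;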

[Screenshot: NAV notepad demo page]

Here is the link to the object. It is a 2013 R2 object but can be implemented in 2013 as well.

Posted in Dynamics NAV

JobQueue in NAV 2013

26th February 2014

I posted a while back a NAS solution that used two NAS instances to run jobs for an unlimited number of companies. That solution worked great and saved money on buying NAS licenses for each company. In NAV 2013, NAS runs on the service tier, so the old solution will not work. I had also written a second solution that used NAS and web services; it relied on the MSXML DOM COM automation object, and since automations are no longer supported on the service tier, its data types would need to be changed to DotNet types to work in NAV 2013 R2. In NAV 2013 you no longer need these solutions: the functionality comes by default with NAS. When you enable NAS on a NAV 2013 service tier, it loops through all the companies and looks at the Job Queue Entry table. You need to create a Job Queue record in each company and make sure "Start automatically from NAS" is checked.

[Screenshot: Job Queue setup]

NAS looks at each company for job queues and starts a background session. These background sessions run separately in each company and process any jobs that are set up in that company, so there is no need to install multiple NAS service tiers. The NAS service is just used to start these background sessions, monitor them, and make sure they are running; if one of them stops, the NAS service starts another background session. So if you have hundreds of companies, each company will have a background session running, based on the Job Queue record created in that company. These background sessions process jobs concurrently, which is a much more scalable solution. And the best part is that you no longer need licenses for these background sessions. So make sure you are running only one NAS Windows service instance for the job queue. If you need multiple sessions, create multiple Job Queue records with different category filters. People might also get confused by the table names: Job Queue = background session; Job Queue Entry = the codeunit/report that performs a certain task, such as running Adjust Cost. The startup codeunit for NAS is 450.
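To make the mechanics concrete, here is a simplified C/AL sketch of the kind of loop that startup codeunit performs; the dispatcher codeunit name and variables are my assumptions, not Microsoft's actual code:

// Rough sketch only: loop through all companies and start one
// background session per Job Queue record flagged for NAS.
IF Company.FINDSET THEN
  REPEAT
    JobQueue.CHANGECOMPANY(Company.Name);
    JobQueue.SETRANGE("Start Automatically From NAS",TRUE);
    IF JobQueue.FINDSET THEN
      REPEAT
        // STARTSESSION runs the codeunit in a new background session
        // in the given company, without consuming a user license.
        STARTSESSION(SessionID,CODEUNIT::"Job Queue Dispatcher",Company.Name);
      UNTIL JobQueue.NEXT = 0;
  UNTIL Company.NEXT = 0;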

[Screenshot: Job Queue records]

Posted in Dynamics NAV 2013

Automatic Lock Objects in NAV 2013 and 2013 R2

20th November 2013

Several years ago I wrote a blog about a checkout tool written in SQL CLR. The solution was based on a SQL table trigger on the Object table that wrote the data into a Checkout table. If another user tried to modify the same object, the user would get an error that somebody else had modified the object. In NAV 2009 Microsoft introduced checkout functionality: they added new fields in the Object table called Locked and Locked By, and under Tools->Options they added the property "Auto-Lock on Design". This option isn't useful, since a developer often just designs objects for research purposes and doesn't want to lock them; a lot of NAV development involves finding where the code is and how it works. The original tool I wrote still works in NAV 2013. The issue is that in order to check in an object you have to run the table and delete the record, and as you know, in NAV 2013 data manipulation is done only from the RTC. I've built a new solution that uses the new Locked and Locked By fields in the Object table. The solution does not use any new objects; it's just a SQL trigger on the Object Tracking table. When an object is modified in NAV, the Development Client writes to/updates the Object Tracking table. The SQL trigger then updates the Locked and Locked By fields in the Object table.
Here is the script.


CREATE TRIGGER [AutoLock] ON [dbo].[Object Tracking]
AFTER INSERT, UPDATE
AS
DECLARE @Type int
DECLARE @ObjectID int
DECLARE @ChangeType int
DECLARE @LockedBy varchar(132)

-- The Development Client writes one Object Tracking row per change,
-- so reading single values from "inserted" is sufficient here.
SET @Type = (SELECT [Object Type] FROM inserted)
SET @ObjectID = (SELECT [Object ID] FROM inserted)
SET @ChangeType = (SELECT [Change Type] FROM inserted)

SET @LockedBy = (SELECT [Locked By] FROM [Object] WHERE [Type] = @Type AND [ID] = @ObjectID)

-- Lock only real objects (@Type > 0) on modify/insert (@ChangeType = 0),
-- and only when nobody holds the lock already.
IF (@Type > 0) AND (@ChangeType = 0) AND (@LockedBy = '') BEGIN
  UPDATE [Object]
  SET [Locked By] = SYSTEM_USER,
      [Locked] = 1
  WHERE [Type] = @Type AND [ID] = @ObjectID
END

This trigger fires only when a developer modifies/inserts an object, so you can design an object without worrying that it will be locked. To release the lock, the developer simply selects one or more objects and chooses Function->Release Locks. If you are deleting objects, make sure you lock them first. This solution works with 2009 R2 as well. Enjoy.
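If you ever need to release locks programmatically rather than through the menu, a hypothetical C/AL helper could look like this; it is not part of the solution above, just a sketch against the system Object table (2000000001):

// Release every lock the current developer holds.
// Object is a Record variable on the system Object table.
Object.SETRANGE("Locked By",USERID);   // only touch your own locks
Object.MODIFYALL(Locked,FALSE);
Object.MODIFYALL("Locked By",'');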

Posted in Dynamics NAV, Dynamics NAV 2013

Using Enumerators in C/AL to iterate through files in a folder

26th March 2013

Enumerators in .NET allow you to iterate through arrays and collections in a class. You can see them used, for example, in a FOREACH loop. Here is an example.

DirectoryInfo di = new DirectoryInfo("c:\\temp");
FileInfo[] files = di.GetFiles("*.pdf");
foreach (FileInfo fileInfo in files)
{
    Console.WriteLine(fileInfo.Name);
}

There is no direct way to translate a FOREACH loop into C/AL, but there is a workaround using the List class. In the above example, DirectoryInfo.GetFiles returns an array of FileInfo, and the FOREACH loop prints each file name to the console.

So the first line in NAV initializes DirectoryInfo with its constructor:

DirectoryInfo := DirectoryInfo.DirectoryInfo('C:\temp\');

The second line is where we assign the result to a List. The C/AL compiler should produce an error here, but it does not; it appears to do no compile-time type checking and determines the object type at runtime, as if we had written DirectoryInfo.GetFiles().ToList();.


List := DirectoryInfo.GetFiles('*.txt');
enumerator := List.GetEnumerator();

Once we have the enumerator from the List class, we can loop through each object:

WHILE enumerator.MoveNext DO BEGIN
  FileInfo := enumerator.Current();
  MESSAGE('%1',FileInfo.Name);
END;

Here is the whole code with the variable types:

DirectoryInfo  DotNet  System.IO.DirectoryInfo.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
FileInfo  DotNet  System.IO.FileInfo.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
List  DotNet  System.Collections.Generic.List`1.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
enumerator  DotNet  System.Collections.IEnumerator.'mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'

DirectoryInfo := DirectoryInfo.DirectoryInfo('C:\temp\');

List := DirectoryInfo.GetFiles('*.txt');

enumerator := List.GetEnumerator();

WHILE enumerator.MoveNext DO BEGIN
  FileInfo := enumerator.Current();
  MESSAGE('%1',FileInfo.Name);
END;

Posted in DotNet, Dynamics NAV 2013

NAV 2013 Chart Add-in

5th March 2013

NAV 2013 comes with a new chart add-in. In NAV 2009 we had basic charts, which were XML files that could be displayed on a page. In this blog I'll show the steps required to build a basic chart with the add-in.
1. Create a new Page of type CardPart. The source table needs to be the Business Chart Buffer table.
2. Click Finish. You should see the Page in design mode with just a ContentArea line.
3. Add a new field line below ContentArea. Enter MyChart in the Name field and leave SourceExpr blank. Go to the properties of the line and set the ControlAddIn property to Microsoft.Dynamics.Nav.Client.BusinessChart. Your Page should look like the picture below.
[Screenshot: page design with the MyChart field]

4. For this chart I will show sales orders by location as an example.
5. Go to the code and add the following OnFindRecord trigger (a text sketch follows the screenshot below).
[Screenshot: OnFindRecord trigger code]
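Since the trigger is shown only as a screenshot, here is a sketch of roughly what it contains, reconstructed from the pie-chart variant later in this post; the sales order filter and variable names are my assumptions:

// OnFindRecord(Which : Text) : Boolean on the CardPart page.
// Rec is the Business Chart Buffer record; Location and SalesHeader
// are Record variables, I is an Integer.
Initialize;
AddMeasure('Sales',0,"Data Type"::Decimal,"Chart Type"::Column);
Location.SETRANGE("Use As In-Transit",FALSE);
IF Location.FINDSET THEN REPEAT
  AddColumn(Location.Code);                // one X-axis value per location
  SalesHeader.SETRANGE("Document Type",SalesHeader."Document Type"::Order);
  SalesHeader.SETRANGE("Location Code",Location.Code);
  SetValue('Sales',I,SalesHeader.COUNT);   // measure value at column index I
  I += 1;
UNTIL Location.NEXT = 0;
Update(CurrPage.MyChart);
EXIT(TRUE);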

And that's it. If you run the Page directly from the Object Designer, you will get an error:
"The control add-in on control MyChart on page MyChart Example has not been instantiated."
You need to add the Page as a part to view it, or run it from another page as an action:
PAGE.RUNMODAL(PAGE::"MyChart Example");
The Page looks like this in the Cronus company with demo data.
[Screenshot: the finished column chart]

The above example is a simple chart, but you can build many other kinds. You do have to consider performance when writing your chart: if you are querying historical tables with a lot of data, I suggest creating a table to store the calculated data and, on opening the page, checking whether there are any changes and updating only those. You could also schedule the updates on NAS. There are additional settings you can set; for further research, take a look at the "Trailing Sales Orders Chart" Page.
Building charts based on this add-in is quite powerful and simple, and they can be developed rapidly. For example, just changing the code a little:


AddMeasure('Sales',0,"Data Type"::Decimal,"Chart Type"::Pie);
Location.SETRANGE("Use As In-Transit",FALSE);
IF Location.FINDSET THEN REPEAT
  AddColumn(Location.Code);
  SalesHeader.SETRANGE("Location Code",Location.Code);
  SetValue('Sales',I,SalesHeader.COUNT);
  I += 1;
UNTIL Location.NEXT = 0;
Update(CurrPage.MyChart);

This gives you a nice pie chart.
[Screenshot: the pie chart]

Attached is the object link.

Posted in Dynamics NAV

How to remove Namespaces in XMLport in NAV 2013

7th December 2012

Microsoft introduced the XMLport datatype in Dynamics NAV 5.0. It was a great tool for loading XML files, but it had trouble dealing with namespaces. Microsoft released workaround code that removed all the namespaces from the XML file. The code looked like this.

[Screenshot: the old MSXML-based RemoveNameSpaces code]

In Dynamics NAV 2013, 32-bit COM objects are no longer supported on the service tier, and Microsoft's suggestion is to move to DotNet data types instead of automations. I've used the above solution in many of my projects, and I'm sure many partners have used it in their existing solutions. When you upgrade, you will run into this code, and it will need to be rewritten against the System.Xml DotNet types. The problem partners will have is reimplementing the transformNodeToObject function. I've provided a solution below that does the same processing with System.Xml.

[Screenshot: the new System.Xml-based RemoveNameSpaces code]
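Since the new code is shown only as a screenshot, here is a rough C/AL sketch of the same idea, using System.Xml.Xsl.XslCompiledTransform to apply a namespace-stripping stylesheet; the file paths and variable names are illustrative, and the attached codeunit may differ:

// DotNet variables:
//   XmlDoc, XslDoc : System.Xml.XmlDocument
//   Xsl            : System.Xml.Xsl.XslCompiledTransform
//   Writer         : System.Xml.XmlWriter

// Load the source document.
XmlDoc := XmlDoc.XmlDocument;
XmlDoc.Load('C:\temp\input.xml');

// A standard identity transform that copies elements and attributes
// by local name only, dropping every namespace.
XslDoc := XslDoc.XmlDocument;
XslDoc.LoadXml(
  '<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">' +
  '<xsl:output method="xml" indent="no"/>' +
  '<xsl:template match="/|comment()|processing-instruction()">' +
  '<xsl:copy><xsl:apply-templates/></xsl:copy></xsl:template>' +
  '<xsl:template match="*"><xsl:element name="{local-name()}">' +
  '<xsl:apply-templates select="@*|node()"/></xsl:element></xsl:template>' +
  '<xsl:template match="@*"><xsl:attribute name="{local-name()}">' +
  '<xsl:value-of select="."/></xsl:attribute></xsl:template>' +
  '</xsl:stylesheet>');

// Compile and run the transformation.
Xsl := Xsl.XslCompiledTransform;
Xsl.Load(XslDoc);
Writer := Writer.Create('C:\temp\output.xml');
Xsl.Transform(XmlDoc,Writer);
Writer.Close;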

Before the transformation, your XML file looks like this.

[Screenshot: XML file with namespaces]

And after the transformation, like this.

[Screenshot: XML file without namespaces]

Attached is the codeunit object.

Posted in Dynamics NAV

Upgrade from NAV 4.0 sp3 to NAV 2013 Beta

12th July 2012

Many clients are still on an old version of Dynamics NAV and have not upgraded to 2009. Many of them decided not to invest in upgrading to a newer version because they didn't see any value, and I would agree with many of those customers. 2009 required more development hours and testing than previous upgrades. Many features were missing from the RTC, so some users worked in the classic client while others used the RTC. Customizing reports was cumbersome, and modified forms and pages would go out of sync. There were certain scenarios where just an executable upgrade to 2009 was sufficient to use the web services features.
I would like to share the experience of upgrading an existing Dynamics NAV 4.0 installation to the Dynamics NAV 2013 beta. The client was on 4.0, support for which had expired a while back, and they were interested in moving to a supported version. Moving to 2009 didn't make sense with the 2013 release right around the corner. The client wanted to see the application with their data and go through their workflows in 2013, so that when the final build is released they will be ready to upgrade to it. The client is also using the C/SIDE database. The work was assigned to a junior developer with a couple of months of experience, under my supervision. This client had about 350 modified objects: 82 tables, 109 forms, 96 reports, and 30 codeunits, a medium-sized number of objects.
I suggested merging the tables first, then the codeunits, then the forms, and the reports last. He found that merging tables was easy and went quickly using text compare tools. Merging the codeunits was a little more difficult; some of them had changed dramatically, so they were harder to compare as text, mainly because he was comparing modified 4.0 to 2013 text files. He missed a couple of functions, but I helped him resolve those. To upgrade the forms, I suggested that he redo the changes manually in pages instead of using the form transformation tool. This let him get more experience designing and working with pages and become more familiar with the application; his comment was that pages were his favorite objects to merge. He had a text compare tool compare modified 4.0 to standard 4.0 and made the changes manually in pages.
When he started upgrading the reports, we came to the conclusion that in order to test them and make sure they worked correctly, we needed actual data. So we decided to upgrade the data first, then upgrade the reports and run them to confirm they give the same results as in classic. There is no direct upgrade path from 4.0 to 2013, so we went the route of upgrading to 2009 as an intermediate step. You need a 2009 database with all the custom fields, and you can create one by copying and pasting all the fields in the 50000 range from the virtual Field table into a Cronus database. You cannot paste option fields, so those you have to add directly in each table.
We imported the 2009 Step 1 upgrade codeunit and ran it, then loaded the 2009 Cronus database with the custom fields, and started running into issues related to NA Payroll. Payroll was part of the North American localization in 4.0 and Microsoft removed it in 5.0; there were a couple of fields on the Employee table that the customer was using. We had to create custom fields, move the data out of those HR fields, and blank them, since they were going to be deleted in 2009. Step 2 of 2009 ran without any issues, and Step 1 of 2013 was loaded into the database and ran without issues. Then we opened the database with the 2013 classic client, which upgraded it, converting all the text fields to Unicode. Next we loaded Step 2 of 2013; this part you have to run from the RTC. The process is supposed to create the new Dimension Set IDs. Here we ran into another issue: the customer had deleted a dimension value record instead of blocking it, and we were getting an error that the Dimension Set ID value cannot be blank. I recreated the value, and then we ran into a second issue: there were ledger entry dimension records with a blank dimension value. I don't know how they got created; perhaps the 2006-to-4.0 upgrade created those entries. I deleted them as well, and then Step 2 finished the upgrade.
When we started the upgrade the database was 40 GB; when we finished, it was 270 GB, with 85 GB of data and 190 GB of log file. So make sure you have plenty of space to finish the upgrade process.
We decided to make the changes in the modified reports manually, to give the developer more experience with the 2013 report designer, just like with the pages. Upgrading the custom reports in the 50000 range was one of the biggest surprises: it was almost too easy. You import the objects and compile them, and then go to Tools->Upgrade Reports. If a report does not compile because of some reference or code, go back to the 4.0 database and comment out the code, then reimport and compile it and select Upgrade Reports. Most of the reports have been upgraded without any additional changes so far, and they look identical to the classic reports. There were a couple of reports, such as checks, that needed manual tweaking, but those reports required tweaking during each upgrade in the past.
The client is now in the process of looking at 2013 with their data. They are planning to train their users, who are not that computer savvy, and there are a lot of things for them to learn. This way they will be ready when 2013 is released.

Posted in Dynamics NAV

Compressing Warehouse Entry in Dynamics NAV 2013

3rd July 2012

My last blog was about how to improve performance if you are using the NAV warehouse system. In that solution I used a SQL script that summarizes the Warehouse Entry table by item, variant, zone, bin, lot, serial, and unit of measure, and inserts the data into a new table in the 50000 range; a NAV processing report then deletes all the warehouse entries and inserts the summarized entries back into Warehouse Entry. In NAV 2013, Microsoft has provided a new object type, Query, for C/SIDE developers. You can use this object to retrieve data from SQL quickly and let SQL Server do some of the processing and calculation. I've rebuilt the solution in NAV 2013 using a query and a codeunit. This makes the solution much easier to schedule nightly, keeping the database optimized and fast for the warehouse system.

Here is a screenshot of the query object for summarize warehouse entry.

[Screenshot: the Summarize Warehouse Entry query object]

And here is the Codeunit.

[Screenshot: the compression codeunit]
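Since the codeunit is shown only as a screenshot, here is a rough C/AL sketch of the pattern it follows; the query name, its columns, and the temporary-table buffering are my assumptions, not the exact attached objects:

// SummarizeWhseEntry is a Query variable over the Warehouse Entry table,
// grouped by item/variant/zone/bin/lot/serial/UOM with summed quantities.
// The summarized rows are buffered in a temporary record first, because
// we cannot delete from Warehouse Entry while still reading the query.
EntryNo := 0;
SummarizeWhseEntry.OPEN;
WHILE SummarizeWhseEntry.READ DO
  IF SummarizeWhseEntry.SumQtyBase <> 0 THEN BEGIN
    EntryNo += 1;
    TempWhseEntry.INIT;
    TempWhseEntry."Entry No." := EntryNo;
    TempWhseEntry."Item No." := SummarizeWhseEntry.ItemNo;
    TempWhseEntry."Bin Code" := SummarizeWhseEntry.BinCode;
    TempWhseEntry."Lot No." := SummarizeWhseEntry.LotNo;
    TempWhseEntry.Quantity := SummarizeWhseEntry.SumQty;
    TempWhseEntry."Qty. (Base)" := SummarizeWhseEntry.SumQtyBase;
    // ...assign the remaining grouped fields the same way...
    TempWhseEntry.INSERT;
  END;
SummarizeWhseEntry.CLOSE;

WhseEntry.DELETEALL;
IF TempWhseEntry.FINDSET THEN
  REPEAT
    WhseEntry := TempWhseEntry;
    WhseEntry.INSERT;
  UNTIL TempWhseEntry.NEXT = 0;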

Attached are the objects

Posted in Dynamics NAV

Compressing Warehouse Entry

15th May 2012

There have been many topics on compressing/deleting subledgers and ledgers in NAV. The table I would like to talk about is Warehouse Entry. If a location is set up with bins or zones, you can see entries recording every activity that has been performed for an item/bin/zone/unit of measure/lot/serial in Warehouse Entry. The table assigns consecutive numbers to all entries, and the entries appear in entry number order. The activities include every removal and placement of items in the bins and every adjustment registered to the bins. Dynamics NAV creates the warehouse entries whenever you post or register a warehouse document or journal. As you can already guess, this table can quickly grow very large as users register transactions. Standard NAV provides a method to compress warehouse entries, but it runs in C/AL and is a very slow and time-consuming process. I've built the following solution, which deletes all the warehouse entries and recreates them with a SQL script and a NAV processing report.
The process I followed is as follows.
In the Object Designer, open the Warehouse Entry table in design mode and save it as a new table; I called it "Warehouse Entry Compressed", with table ID 50085. This way, if you have done customizations and added custom fields, they all exist in the new table as well.
Afterwards, run the SQL script below to populate the new table. If you have custom fields, you will need to include them in the script.
INSERT INTO [dbo].[Cronus$Warehouse Entry Compressed]
([Entry No_]
,[Journal Batch Name]
,[Line No_]
,[Registering Date]
,[Location Code]
,[Zone Code]
,[Bin Code]
,[Description]
,[Item No_]
,[Quantity]
,[Qty_ (Base)]
,[Source Type]
,[Source Subtype]
,[Source No_]
,[Source Line No_]
,[Source Subline No_]
,[Source Document]
,[Source Code]
,[Reason Code]
,[No_ Series]
,[Bin Type Code]
,[Cubage]
,[Weight]
,[Journal Template Name]
,[Whse_ Document No_]
,[Whse_ Document Type]
,[Whse_ Document Line No_]
,[Entry Type]
,[Reference Document]
,[Reference No_]
,[User ID]
,[Variant Code]
,[Qty_ per Unit of Measure]
,[Unit of Measure Code]
,[Serial No_]
,[Lot No_]
,[Warranty Date]
,[Expiration Date]
,[Phys Invt Counting Period Code]
,[Phys Invt Counting Period Type]
)

Select
[Entry No_]
,[Journal Batch Name]
,[Line No_]
,[Registering Date]
,LocationCode
,[Zone Code]
,BinCode
,[Description]
,ItemNo
,SQuantity
,QtyBase
,[Source Type]
,[Source Subtype]
,[Source No_]
,[Source Line No_]
,[Source Subline No_]
,[Source Document]
,[Source Code]
,[Reason Code]
,[No_ Series]
,[Bin Type Code]
,[Cubage]
,[Weight]
,[Journal Template Name]
,[Whse_ Document No_]
,[Whse_ Document Type]
,[Whse_ Document Line No_]
,[Entry Type]
,[Reference Document]
,[Reference No_]
,[User ID]
,[Variant Code]
,[Qty_ per Unit of Measure]
,UOM
,SerialNo
,LotNo
,[Warranty Date]
,[Expiration Date]
,[Phys Invt Counting Period Code]
,[Phys Invt Counting Period Type]

From
(
Select *
From
(Select [Item No_] as ItemNo,[Location Code] as LocationCode,Sum([Quantity]) as SQuantity, SUM([Qty_ (Base)]) as QtyBase,
[Bin Code] as BinCode,[Lot No_] as LotNo,[Serial No_] as SerialNo, [Unit of Measure Code] as UOM,
(SELECT Top 1 [Entry No_]
FROM [dbo].[Cronus$Warehouse Entry] as WEntry
where WE.[Item No_] = WEntry.[Item No_] and
WE.[Bin Code]= WEntry.[Bin Code] and
WE.[Lot No_]= WEntry.[Lot No_] and
WE.[Location Code] = WEntry.[Location Code] and
WE.[Unit of Measure Code] = WEntry.[Unit of Measure Code] and
WE.[Serial No_] = WEntry.[Serial No_]
order by [Entry No_] desc) AS EntryNo
from [dbo].[Cronus$Warehouse Entry] as WE
group by [Item No_],[Location Code],[Bin Code],[Lot No_],[Serial No_],[Unit of Measure Code]
) as Summary
where QtyBase <> 0) as Summary2

Left Join

[dbo].[Cronus$Warehouse Entry]
on (Summary2.EntryNo = [Entry No_])

Then run report 50085, Compressed Warehouse Entry. This report deletes all the warehouse entries and recreates them based on the compressed entries; a rough sketch of its logic follows.
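Here is a minimal C/AL sketch of what such a processing report can do; the actual attached report may differ. TRANSFERFIELDS works here because the compressed table was saved with an identical field layout:

// Delete every original warehouse entry, then recreate the
// compressed entries under their original summary entry numbers.
WarehouseEntry.DELETEALL;
IF WhseEntryCompressed.FINDSET THEN
  REPEAT
    WarehouseEntry.INIT;
    WarehouseEntry.TRANSFERFIELDS(WhseEntryCompressed);
    WarehouseEntry.INSERT;
  UNTIL WhseEntryCompressed.NEXT = 0;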
There are many advantages to cleaning up this table: it improves performance and concurrency and saves space.

Here is the link to the text object and SQL script.

Posted in Dynamics NAV

NAV 2013 Beta Released

14th May 2012

Microsoft today released the beta of NAV 2013 on PartnerSource. Here is the URL. It has been released for the first 14 countries.
I suggest downloading it and starting to study it.

Posted in Dynamics NAV