Bulk Deletion of Notifications in Dynamics NAV

Posted May 11th, 2014 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Have you ever experienced this:

For some reason, the NAS has not worked properly and the Notification page on the RoleCenter is flooded with messages. Next thing is to fix the problem with the NAS and delete the notifications.

No problem, just select all and then delete them, right?

NOT

You have to select them individually and Remove the notification:

Ok then, let’s run the table and delete the records from there (everything is easier with a developer’s license :))

Well, that is possible, but it is hardly a user interface we want to pass on to our users.

The notifications are not stored in a normal Dynamics NAV table, but in the Record Link table. Deleting all records in the Record Link table is not an option, because that would remove ALL real record links and intentional notes as well.

Running the table, we can see the record links.

The Type field shows whether the record link points to a link or a note. Within the notes, we can distinguish between user notes and system notes by the User ID. All the more reason for running the NAS under a dedicated domain User ID.

The bottom line is a note on a customer, whereas the rest are the notifications from the NAS user.

Now we have enough information to create a report to do the deletion:

The idea is to use the Job Queue to find all the NAS users and combine it with the Record Link table.

I start by changing a report property:

ProcessingOnly: there is no reason to involve the Visual Studio layout, unless you want documentation of the deletion.

Then a property on the Job Queue DataItem:

DataItemTableView, to remove the sorting option on the request page

ReqFilterFields, to enable filtering on the job queue code.

Then a property on the Record Link DataItem:

DataItemTableView, to remove the sorting option on the request page and to filter Type to Notes.

DataItemLink, to filter users with the User Id from the Job Queue.

ReqFilterFields, to enable filtering on Company Name and Creation Date.

Now everything is ready but I need a variable:

…and a text constant:

And a couple lines of code:
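The code itself was shown as a screenshot; a sketch of what those lines could look like in C/AL, assuming an integer variable NoOfDeleted and a text constant Text001 for the confirmation (all names are my own, not the original code):

```
// Record Link - OnPreDataItem()
// Text001: 'Are you sure you want to delete the notifications?'
IF NOT CONFIRM(Text001) THEN
  CurrReport.QUIT;

// Record Link - OnAfterGetRecord()
"Record Link".DELETE;                 // delete the notification
NoOfDeleted := NoOfDeleted + 1;       // count what we removed

// Record Link - OnPostDataItem()
MESSAGE('%1 notifications deleted.', NoOfDeleted);
```

The DataItemTableView and DataItemLink properties on the Record Link data item make sure only NAS-created notes ever reach OnAfterGetRecord.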

Save and exit

Now add the report to the IT Manager RoleCenter (9018):

Now run the report:

Take everything, please

And YES I am sure.

Press F5 and the RoleCenter looks like this:

Niiiice :)

If you are lazy, you can download the objects for Dynamics NAV 2013 here.

The objects are shared as is, and no guarantees are given.

Finally a real author

Posted May 8th, 2014 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Now I am officially a writer - I got my book.

Actually, I got 1,517 copies.

Help me clear out the basement again :)

Buy one or download a free sample at http://b-a.dk

Using a mapped drive for backup in SQL server

Posted April 27th, 2014 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013, SQL Server

SQL Server has always had this little peculiarity: the available drives for import/export of data or backup are limited to the physical drives of the computer. Mapped drives are not visible.

The drives before the mapping:

…and the mapping

…and the drives after the mapping on the computer

…and the drives in the SQL server after the mapping

The reason for this is that SQL Server usually runs as a different user than the one you used to log on to the system. However, it is possible to trick SQL Server into seeing a mapped drive.

The demonstration here is performed in Windows Server 2008 R2, but the principle is the same in Windows 7 or Windows 8. The inspiration for this workaround comes from the backup system CrashPlan, which has a similar workaround for backing up to a NAS device.

In order to make a mapping visible to SQL Server, you first need to find out which user account the SQL Server service runs under. So start by going to Services:

Normally SQL Server would be running as a domain user, but in this case it is the Local System account.

Now we need to make a logon script and place it somewhere on the disk accessible to the Local System account. Here I choose the c:\temp folder and create a new file called MapNetworkDrive.cmd. In the file, I write:
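The file content was shown as a screenshot; a typical version, with the server, share and drive letter as placeholders you would replace with your own values, is a single net use command:

```
REM MapNetworkDrive.cmd - run at startup as the Local System account
REM MYSERVER\SQLBackup and Z: are placeholders
net use Z: \\MYSERVER\SQLBackup /persistent:yes
```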

Now we need to run this script as the Local System account every time the system is restarted.

The Task Scheduler can help us there:

Creating the new task as the SYSTEM account and with highest privileges:

Setting the triggers:

And the Actions:

Now it is ready to test:

Now the mapping on the computer will look the same, but if you disconnect the drive we just mapped a while ago, the mapping will look like this:

…but from SQL server it suddenly looks like this:

Problem solved.

Upgrade your knowledge about the manufacturing module in Microsoft Dynamics NAV

Posted April 14th, 2014 by Peik Bech-Andersen
Categories: Dynamics NAV Books, Manufacturing, Microsoft Dynamics NAV 2013

It has happened – the book is OUT!

E-book is available for download at http://b-a.dk

FREE preview is still available at http://ba-consult.dk

Paperback will be available from April 25th

But you can bundle the paperback and the download and start reading already tonight.

Upgrade your knowledge about the manufacturing module in Microsoft Dynamics NAV

Posted April 7th, 2014 by Peik Bech-Andersen
Categories: Dynamics NAV Books, Manufacturing, Microsoft Dynamics NAV 2013

It is with great pleasure that I can announce that my book is finished and YOU can get a free preview.

The book will be released: Friday April the 15th as e-book and Friday April 25th as paperback.

But the preview is out already now.

Download the book at ba-consult.dk

Buy the book at http://b-a.dk
starting April 25th

Description

The Manufacturing module of Microsoft Dynamics NAV has existed for almost 17 years. A quick estimate says that around 10% of the roughly 100,000 installed Microsoft Dynamics NAV solutions use the Manufacturing module.

Despite this, there is almost no literature on the subject.

My ambition with this book is to make “The Book” on Manufacturing in Microsoft Dynamics NAV. The book that all customers running the Manufacturing module and all consultants, both new and experienced, will need to setup and maintain the system.

The book will cover all versions from 4.0 up to NAV 2013, highlighting the new features and pointing out obsolete functionality.

Style
The book can be used either as a setup guide for implementing a new system or as a handbook for later adjustments. The table of contents follows the implementation tasks as they would fall in an implementation of a new manufacturing system.
Target Audience
  • End-user customers who are going to implement the manufacturing module of Microsoft Dynamics NAV, and
  • New consultants needing to build up their skills around manufacturing, are the main audience.
  • End-user customers who are already running the manufacturing module will need it for the daily maintenance of their system, since there is no other material on the market.
  • And lastly, experienced consultants need it for adjusting running manufacturing solutions.

Don’t hesitate to download your copy here, or register in the shop so you will get the latest information about the release.

I hope you like it.

Create a super user on a restored Dynamics NAV 2013 Database in SQL Script

Posted January 27th, 2014 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013, SQL Server

Ever since Microsoft released Dynamics NAV 2013, there has been a discussion on the best way to create a user in a restored database.

The problem is this:

When a SQL backup is made and ported to a different environment, the Windows or Active Directory users that were created in the old environment are not necessarily created in the new environment.

It is always possible to restore the backup on the new SQL server, but that does not give me access to the database.

The easy way is to delete all information in four tables, but that will delete all existing users. If they are not needed, that is no problem, but if we want to keep the existing users, this is not an option.

Deleting all users can be done by running the following script.
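The script itself was shown as an image; a sketch of it against a NAV 2013 database could look like this (the database name is an example, and the table names should be verified against your own database before running anything):

```
-- Removes ALL NAV users from the database. Use with care.
USE [Demo Database NAV (7-0)];   -- example database name
DELETE FROM [dbo].[Access Control];
DELETE FROM [dbo].[User Property];
DELETE FROM [dbo].[User Personalization];
DELETE FROM [dbo].[User];
```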

If the user names are exactly the same, just with a different domain, then it is possible to “update” the Windows Security ID with this script.
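Again, the original script was an image; a minimal sketch, with the user names and the new SID as placeholders (the string SID of the new account can be found by running whoami /user while logged on as that account):

```
-- Point an existing NAV user at the matching account in the new domain.
-- All names and the SID below are placeholders.
UPDATE [dbo].[User]
SET [User Name] = 'NEWDOMAIN\jdoe',
    [Windows Security ID] = 'S-1-5-21-111-222-333-1001'
WHERE [User Name] = 'OLDDOMAIN\jdoe';
```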

Combined with the syslogins table and a little SQL magic, it may even be possible to change all users from one domain to another and from one Windows Security ID to another.

But what if I want a script that will:

  • Create a user on the SQL server
  • Generate a random User SID
  • Create the user in the database
  • Add a role to the user
  • Give permission to the user in the new database

In one go.

It took a bit of Googling and help from dailycoding.com, Daniel Rimmelzwaan on Mibuso and other resources to compile this script, which I pass on to you without any guarantees, to serve as inspiration. Just add the user name, the database and the permissions you want to assign to the user, and run the script.
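The downloadable script is the authoritative version; as inspiration, a shortened sketch of the 2013 variant could look like this (all names are placeholders, and the column lists should be verified against your own NAV 2013 database):

```
USE [Demo Database NAV (7-0)];   -- example database name

DECLARE @UserName nvarchar(64)     = N'MYDOMAIN\jdoe';                -- placeholder
DECLARE @UserSID  uniqueidentifier = NEWID();                         -- random NAV User Security ID
DECLARE @WinSID   nvarchar(119)    = N'S-1-5-21-111-222-333-1001';    -- placeholder string SID

-- 1. Create the login on the SQL Server and give it access to the database
CREATE LOGIN [MYDOMAIN\jdoe] FROM WINDOWS;
CREATE USER  [MYDOMAIN\jdoe] FOR LOGIN [MYDOMAIN\jdoe];
EXEC sp_addrolemember 'db_owner', 'MYDOMAIN\jdoe';

-- 2. Create the NAV user and assign the SUPER role in all companies
INSERT INTO [dbo].[User]
  ([User Security ID],[User Name],[Full Name],[State],[Expiry Date],
   [Windows Security ID],[Change Password],[License Type],[Authentication Email])
VALUES (@UserSID, @UserName, N'', 0, '1753-01-01', @WinSID, 0, 0, N'');

INSERT INTO [dbo].[User Property]
  ([User Security ID],[Password],[Name Identifier],[Authentication Key],
   [WebServices Key],[WebServices Key Expiry Date],[Authentication Object ID])
VALUES (@UserSID, N'', N'', N'', N'', '1753-01-01', N'');

INSERT INTO [dbo].[Access Control]
  ([User Security ID],[Role ID],[Company Name])
VALUES (@UserSID, N'SUPER', N'');
```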

And you are in.

The scripts for 2013 and 2013R2 can be downloaded from here.

Using Dynamics NAV 2013 Query Objects in Reports and Pages

Posted January 27th, 2014 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

The Query object type was introduced in Microsoft Dynamics NAV 2013 for use in a limited number of places:

  • In programming, similar to a record variable
  • As the source table in a Generic Chart
  • As an OData web service

This was a bit disappointing to me. I had hoped that the Query could function as a source table for other objects as well. I have therefore been experimenting with how to use Query objects as sources for both Reports and Pages. This is what I found out.

Firstly we need to create a Query object.

Creating the Query

So I create a Query Object that will calculate the customer top 10 sales, enabling filtering on Posting Date.

The dataset looks like this:

Notice that I create two filters:

  • Document Type to be hardcoded in the Query as Filter: “Invoice|Credit Memo”
  • Posting Date to be set when calling the Query from outside.

Also, notice that I made Sales LCY into a total.

I change the Query properties to sort by Sales LCY and to show only the first 10 rows:
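In exported object text, the resulting query would look roughly like this (the object number and element names are my own, and the exact export syntax may differ slightly between builds):

```
OBJECT Query 50000 Customer Top 10 Sales
{
  PROPERTIES
  {
    TopNumberOfRows=10;
    OrderBy=Descending(Sum_Sales_LCY);
  }
  ELEMENTS
  {
    { ;DataItem;Customer          ;DataItemTable=Table18 }
    { ;Column  ;No                ;DataSource=No. }
    { ;Column  ;Name              ;DataSource=Name }
    { ;DataItem;Cust_Ledger_Entry ;DataItemTable=Table21;
                                   DataItemLink=Customer No.=Customer."No." }
    { ;Filter  ;Document_Type     ;DataSource=Document Type;
                                   ColumnFilter=Document_Type=FILTER(Invoice|Credit Memo) }
    { ;Filter  ;Posting_Date      ;DataSource=Posting Date }
    { ;Column  ;Sum_Sales_LCY     ;DataSource=Sales (LCY);Method=Sum }
  }
}
```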

Now I am ready to test the Query by running it from the designer and comparing it against the Customer Top 10 report:

This looks good, so now onto the report.

Creating the Report

Since the query object cannot be used directly as the dataset, the virtual Integer table will be used as the dataset:

Then creating the query as a Global variable:

The Date variables are to be used on the Request Page:

And then a little bit of code:

So:

  • Set the filters
  • Open the query in the OnPreDataItem
  • Read each Query record
  • Break after the last record
  • Close the query
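The steps above can be sketched in C/AL like this (the query variable CustTop10 and the date variables are the ones created earlier; the exact names are assumptions):

```
// Integer - OnPreDataItem()
CustTop10.SETRANGE(Posting_Date_Filter, FromDate, ToDate); // set the filters
CustTop10.OPEN;                                            // open the query

// Integer - OnAfterGetRecord()
IF NOT CustTop10.READ THEN   // read the next query row
  CurrReport.BREAK;          // break after the last record

// Integer - OnPostDataItem()
CustTop10.CLOSE;             // close the query
```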

Now we need to insert the Query columns in the dataset:

When using a Query object, the “Include Caption” option cannot be used, so custom captions need to be created in the Label Designer:

Now the dataset is ready to test by running the report, previewing it and pressing Ctrl+Alt+F1 (About this Report) showing the dataset:

And now the layout can be created in Visual Studio:

And now previewing the report:

Not pretty, but let’s see if a bit of “Pimping” can do the trick:

Conducting courses in development of Microsoft Dynamics NAV 2009 and 2013, I hear a lot of complaining about the RTC report generator and the link between NAV and Visual Studio. This report, everything included, took me two hours to make, and next time it will only take me one hour. So stop whining and learn to use the almost endless possibilities instead.

Creating the page

If I want to use the query in a page instead, it is actually simpler. Create the page, again using the Integer table. There is no need for the wizard unless FactBoxes are needed. Then create the Query as a global variable:

Use the columns from the query as fields on the page:

And a little bit of code to:

  • Find the number of rows
  • Set the filter on the Integer table
  • Read the Query in the main loop

It looks like this:
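A sketch of that code in C/AL, with assumed variable names (CustTop10 for the query, NoOfRows as an integer):

```
// OnOpenPage()
// Find the number of rows returned by the query
CustTop10.OPEN;
WHILE CustTop10.READ DO
  NoOfRows := NoOfRows + 1;
CustTop10.CLOSE;

// Set the filter on the Integer table accordingly
Rec.SETRANGE(Number, 1, NoOfRows);

// Reopen the query for the main loop
CustTop10.OPEN;

// OnAfterGetRecord()
// Read one query row per Integer record
CustTop10.READ;
```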

And the result:

This could also be “Pimped” with FactBoxes, Charts and others.

But you can do that yourself.

Upgrading Reports to Microsoft Dynamics NAV 2013

Posted March 27th, 2013 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Touring with my “What’s New Microsoft Dynamics NAV 2013 Installation & Development” course, one of the frequent topics is upgrading reports from earlier versions of Navision or Dynamics NAV to Dynamics NAV 2013. Demonstrating the upgrade process, I have stumbled upon several pitfalls. Actually, there are so many shortcomings that, personally, I find that if the report was originally based on a standard Dynamics NAV report, it will be faster to move the changes to the new standard Dynamics NAV 2013 report. However, many reports in customer solutions are 100% tailor-made, and here we have only one option: upgrading.

Our problems originate from one of the following issues:

  • The form variable type is discontinued and replaced by the page variable type.
  • The FORM object is discontinued and replaced by the PAGE object, meaning that all FORM.RUN commands must be replaced by PAGE.RUN commands.
  • The Request Form is discontinued and replaced by the Request Page.
  • Posted dimensions and ledger entry dimensions are stored differently in Dynamics NAV 2013.
  • Code is no longer executed when placed in a hidden text field in the Page Header area of Visual Studio. This was possible in 2009 and very often used, especially with CODE.GETDATA and CODE.SETDATA. Just to confuse matters, this only applies to printing the report. Previewing the report will look fine (and you can actually print the report from preview, also invoices).

Upgrading to Dynamics NAV 2013 we have two situations:

  • We upgrade a report run from the Classic Client. In this case, the Dynamics NAV 2009 function “Create Layout Suggestion” is performed automatically on upgrading.
  • We upgrade a report run from the RTC Client in Dynamics NAV 2009. Here the Dynamics NAV 2009 RTC layout is used.

Let us take an example with two reports that originated from a 5.0 Classic database:

  1. Report 25 Account Schedule
  2. Report 206 Sales Invoice

Importing the reports into a Dynamics NAV 2013 database, the reports are not compiled. Trying to compile a report will give you the first error found by the compiler, but it is not possible to design the report. We therefore need to import the objects into a Dynamics NAV 2009 database, where all commands for both the RoleTailored Client and the Classic Client exist.

On upgrading from a report run from the Classic Client there are a number of things we need to prepare:

As mentioned, the first thing is to import the objects into a Dynamics NAV 2009 database. The reason is that here we can change all variables defined as form to the page type, as well as change all FORM.RUN commands to PAGE.RUN. Then we need to remove all commands that will not work in 2013, which again means all commands referring to RequestOptionsForm.

Starting with the 25:

  • The first error message refers to a variable named DimValList, which has the form type. This must be changed to the page type.
  • The second error is the RequestOptionsForm control. This must be removed and replaced with an equivalent page command if necessary.

Every time, the object must be exported from the 2009 database and imported into the 2013 database for testing. Once the object can be imported into the 2013 database in compiled state, it is ready to be upgraded and designed further.

Upgrading the report to Dynamics NAV 2013 is performed with a new function Tools/”Upgrade Reports”.

It will:

  1. Upgrade to the Dynamics NAV 2013 Dataset Designer, which means that it takes all fields from the Section Designer and creates them as columns in the Dataset Designer.
  2. Delete the Request Form
  3. Generate a Layout, using the Create Layout Suggestion.

After this, you must:

  1. Adjust the dataset to correspond to Dynamics NAV 2013, especially regarding dimensions.
  2. Move all labels from the dataset to the Label Designer. This is not imperative but it will upgrade the report to Dynamics NAV 2013 standards.
  3. Manually adjust the Layout.
  4. Create the Request Page.

Trying the same with report 206:

  • The RequestOptionsForm control must be removed and replaced with an equivalent page command.
  • Remove all dimension loops (lines and header) from the dataset AND from the sections. Remember the variables pointing to table 359. Alternatively, table 359 Posted Document Dimension must be imported into the Dynamics NAV 2013 database, only to be removed again once the dimension functionality has been replaced with Dynamics NAV 2013 functionality.
  • Import and Upgrade the Report

The upgrade function now tries to fit everything into the page header, but without using CODE.SETDATA and CODE.GETDATA. It therefore defines a large number of text boxes, named after the variables, to be used as report items. They look something like this:

This is not very elegant. These fields can be moved onto another Tablix control, which again can be hidden. However, it is not possible to omit them and address Fields!Variable.Value from the dataset directly. This is because the expression always translates to First(Fields!Variable.Value), which makes no sense if you are printing more than one invoice at a time: it would repeat the customer from the first invoice on all invoices.

This boils down to what I wrote earlier: Start over using a standard Dynamics NAV 2013 report and move or add the customer adaptions again.

Using the same example with the same two reports, this time originating from a 2009 RTC:

All the issues described above also apply to the Dynamics NAV 2009 RTC reports. So on with it again:

  • Remove all variables with the form variable type.
  • Replace all FORM.RUN by PAGE.RUN commands.
  • Delete all references to RequestOptionsForm.
  • Remove all references to Posted Document dimensions or Ledger Entry Dimensions.

So now we can import the objects and upgrade them. Since the layout is already defined in the objects, the upgrade process will reuse the existing layouts. And it does so quite well.

Only one thing needs to be changed: since code is no longer executed when placed in a hidden text field in the Page Header area of Visual Studio, we now need to move the execution of the code from the value expression of the field to the hidden expression of the field:

Before:

After:

The strange thing is that it will show nicely on the preview, just not on the printed invoice.
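Expressed as RDLC expressions, the change is roughly this (the field name and data index are illustrative, not taken from the actual report):

```
' Before (NAV 2009): the code runs in the Value expression of a hidden text box
Value  = Code.SetData(Fields!CompanyInfoName.Value, 1)

' After (NAV 2013): move the call to the Hidden expression of the text box,
' so it still executes on printing, and show the field value directly
Hidden = Code.SetData(Fields!CompanyInfoName.Value, 1)
Value  = Fields!CompanyInfoName.Value
```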

Happy Reporting.

Inventory Batch Posting in Microsoft Dynamics NAV

Posted October 3rd, 2012 by Peik Bech-Andersen
Categories: dynamics nav 2009

Many people approach me with questions about the Inventory Batch Posting in Microsoft Dynamics NAV 2009 (or 4.0 or 5.0 for that matter). It’s not only customers, but also consultants, and there seems to be some confusion about the subject. So here is how I usually present the matter on my courses and for my customers:

Firstly, we need to address how a sale or purchase is posted in Navision. For that we need some flyers and a little explanation. To avoid confusion, I omit everything other than the raw item posting (VAT, discounts etc.).

(1) When an Item is purchased one posting is made:

  1. Credit Vendor
  2. Debit Purchase

(2) Then one Item Ledger Entry is made, of type Purchase. Well, it is actually the corresponding Value Entry that is the basis for the calculation.

Then, when the Adjust Cost batch and Post Inventory Cost to G/L are run:

(3) The Purchase posting is reversed through the Adjustment Account and the amount is posted on the Inventory account instead.

Likewise with the sale:

(4) On posting the sale, the sales posting is made:

  1. Credit Sale
  2. Debit Customer

(5) Then one Item Ledger Entry is made, of type Sale. Again, it is the corresponding Value Entry that is the basis for the calculation.

Then, when the Adjust Cost batch and Post Inventory Cost to G/L are run, the Adjust Cost batch will look up the original cost, which was used when the item was put on inventory, and Post Inventory Cost to G/L will post this:

(6) The original Cost will be removed from the Inventory and put onto the COGS (Cost Of Goods Sold) Account.

Many people see this as one operation, but it is not. It is three operations:

  • Posting of the Purchase/Sale Document
  • Adjust Cost Batch
  • Post Inventory to G/L

And these operations can be separated, and many companies do separate them, mainly because of performance issues.

In the Inventory Setup, we have two setup fields that determine this:

The Automatic Cost Posting field determines whether Post Inventory Cost to G/L should be run automatically when posting items to an item account. Even if this field is checked, it is still necessary to run the Adjust Cost – Item Entries batch. However, if Automatic Cost Adjustment is set up, this can help.

The Automatic Cost Adjustment field has these options.

The On-line Help shows the following table:


Option    Behavior
Never     Costs are not adjusted when posting.
Day       Costs are adjusted if posting occurs within one day from the work date.
Week      Costs are adjusted if posting occurs within one week from the work date.
Month     Costs are adjusted if posting occurs within one month from the work date.
Quarter   Costs are adjusted if posting occurs within one quarter from the work date.
Year      Costs are adjusted if posting occurs within one year from the work date.
Always    Costs are always adjusted when posting, irrespective of the posting date.

This basically means that you can adjust the accuracy of your inventory: if you set it to Day, you get the least accurate inventory, because only entries posted within one day from the work date are adjusted. And in order to get an accurate inventory, you will need to run the Adjust Cost – Item Entries batch before reconciling the inventory with the G/L.

Does this mean that if I set Automatic Cost Adjustment to Always, I will never have to run the Adjust Cost – Item Entries batch again? No: if you use the Manufacturing module, it is necessary to run the Adjust Cost – Item Entries batch every month. And actually, I would always run it before closing the books on an audited fiscal period.

Ok, so far so good. But WHAT Posting Date will be used for these postings?

Well I just set up that it should ALWAYS Adjust my Cost and Post Inventory to G/L. This means that it will try to post the adjustment ON THE SAME DATE AS THE ORIGINAL TRANSACTION.

Hmmmmm

Imagine that we received an Item June 15th and we closed the books on June 30th.

Then, on July 25th, we received the invoice with the correct prices, causing an adjustment to be made. However, we are not very interested in changing our closed fiscal period.

How do we control this?

Well, the problem involves two different issues:

  1. What is the earliest Posting Date we can accept for new fiscal transactions?
  2. Which periods are open for users to post transactions in general and specific for each user?

This is controlled using three different setup tools. And not all of these are the obvious choices.

The first setup is the General Ledger Setup:

The fields “Allow Posting From” and “Allow Posting To” are the fields limiting users to the period they are allowed to post in, right? Well, almost. Firstly, if you have set up another period in the User Setup, this will take precedence over the General Ledger Setup.

Another thing: “Allow Posting From” and “Allow Posting To” should NEVER be blank, because that will enable all users to post in all periods, including old closed periods, back to 1753. And in native Navision, even back to when Jesus was born.

Secondly, “Allow Posting From” in the General Ledger Setup has an undocumented function, at least according to the on-line Help: it is also used by the Adjust Cost batch to define the earliest acceptable posting date for the Post Inventory Cost to G/L batch.

Thirdly, there is one more setup that will interfere with our inventory batch posting: the Inventory Periods. Ever wondered what they were?

If you set up an Inventory Period AND CLOSE IT, it will help define the earliest possible date.

Ok so let’s see:

Here the earliest Posting Date will be July 1st for the Adjust cost and Post Inventory cost to G/L.

Users can in general post from July 1st to August 31st

Individual users can post from May 1st to August 31st

The Inventory period is open and will not influence anything

Here the earliest Posting Date will be August 1st for the Adjust cost and Post Inventory cost to G/L.

Users can in general post from July 1st to August 31st

Individual users can post from May 1st to August 31st

The Inventory period is closed and will therefore define the earliest possible Posting Date.

Here the earliest Posting Date will be May 1st for the Adjust cost and Post Inventory cost to G/L.

Users can in general post from May 1st to August 31st

Individual users can post from July 1st to August 31st

The Inventory period is open and will not influence anything

THIS will give endless problems and error messages to the users. Not to mention problems with having to open closed periods in order to run the Post Inventory to G/L.

I hope this has shed some light on the issue.

Web Service Examples

Posted September 27th, 2012 by Peik Bech-Andersen
Categories: dynamics nav 2009

Over the last couple of months, I have added my web service examples to my own web server, for all of you to see. For each example, I will explain it graphically: what it takes in NAV and what it takes in Visual Studio.

The first example is VERY simple. I created it to time just how long it would take to add an extra website that utilizes web services in NAV. Bear in mind that I already had the whole infrastructure in place: NAV Server, demo database, the Business Web Services service and IIS, so the project was basically this:

Employee Phone Directory

Many companies struggle with phone lists for their employees. More or less outdated Excel spreadsheets are lying around everywhere. Imagine if all employees were maintained in one single database (NAV, of course), and everybody would just connect to a web page that in turn dug into NAV, showing the content of the Employee table from the Human Resources module.

Graphically it would look like this:

What would that take:

  • We need to create a new page, 50000 in this case, since I only need a limited amount of information, and I feel that I can control the permissions better with a new object instead of using the standard page.
  • We need to publish the page as webservice.
  • Lastly, we need to create a web page in Visual Studio using the DataGrid control.
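The Visual Studio part essentially boils down to binding the generated web service proxy to the grid. A sketch of the code-behind, assuming a SOAP service reference named EmployeeRef generated from the published page 50000 (all names here are mine, not from the actual project):

```
// Code-behind sketch for the employee phone directory page.
// EmployeeRef is an assumed service reference to the published NAV page web service.
using System;
using System.Web.UI;

public partial class Employees : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        var service = new EmployeeRef.Employee_Service();
        service.UseDefaultCredentials = true;   // authenticate as the app pool account

        // ReadMultiple with an empty filter array returns all employees from the NAV page
        var filters = new EmployeeRef.Employee_Filter[0];
        EmployeeGrid.DataSource = service.ReadMultiple(filters, null, 0);
        EmployeeGrid.DataBind();
    }
}
```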

Let’s test it:

New page 50000 - Check - 5 Minutes.

New Webservice: Page 50000 Employee – Check – 5 minutes.

Web Page in Visual studio using the DataGrid Control – Check - 2 hours.

Total 2 hours and 10 minutes.

And here it is:     http://ba-consult.dk/employees

Fun eh?

Ok, let’s try something a little more advanced: my customers should be able to log on to my system, see a list of all their invoices and have the invoice printed in the web page. Sounds familiar? The last part, showing the invoice within the web page, is covered in one of my earlier posts. But how about the rest? Because of my natural-born laziness, I decided to skip the login part and just let the customers filter on their own Customer No.

Invoice List

Graphically it would look like this:

What would that take:

  • We need to create a new page, Invoice List, again since I only need a limited amount of information, and I feel that I can control the permissions better with a new object instead of using the standard page.
  • We need a codeunit with the GetInvoice Function.
  • We need to publish the page and the codeunit as webservices.
  • Lastly, we need to create a web page in Visual Studio using the DataGrid control.
  • And we need to create the function that streams out the selected invoice to the web page.

Did this take 2 hours? No, not even close.

But it sure was fun.

Here it is: http://ba-consult.dk/GetInvoice

And another little thing: If you add this to your Home Screen on your A**le device like your iP** or your iP**** then you will see a neat little icon as well.