Implementing the RoleTailored Client with success

Posted April 26th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Have I been awfully quiet lately?

That might be because I have been preparing a one-day pre-conference workshop I am going to conduct at this year's NAV TechDays in Antwerp, Belgium.

You can read more about it on the NAV TechDays webpage.

The topic is:

RoleTailoring Microsoft Dynamics NAV

Strangely enough, that is also the title of my new book. It is almost done and will be available in your favorite web-shop this summer. I will release a free preview version in May.

The content of the book and the workshop will be something like this:

You can sign up for the pre-conference workshop at NAV TechDays already in May. Remember to keep an eye out for the new book. You can start with the free preview, which will keep you going until the final book comes out.




Saving a Customer/Vendor/Item as a template in Dynamics NAV

Posted March 21st, 2015 by Peik Bech-Andersen
Categories: Dynamics NAV 2013 R2

Going Fishing in the MiniApp

As I described in my previous blog post, it is possible to reuse functionality from the MiniApp in the “real” Dynamics NAV.

Here is another example:

In Page 1302 Mini Item Card, there is a sweet little function to save an existing item as a template:

Ok - Go steal:

First, go to the development environment and design page 1302 Mini Item Card.

Go to View/Page Actions:

Copy – Exit and do not save if asked.

Then into page 30 Item Card and into page actions:

Paste next to the Apply Template Action – document the change – save and exit.

Now open the item card on an existing item:

Save as template:

Make a new item:

Enter – and apply a description

It’s Sooooo Easy – 3½ minutes including documentation.

The same function exists on the Mini Customer Card and the Mini Vendor Card as well.

Happy Fishing


Pimping the RoleCenter with Financial data in Dynamics NAV

Posted March 15th, 2015 by Peik Bech-Andersen
Categories: Dynamics NAV 2013 R2

Microsoft launched the MiniApp with Dynamics NAV 2013 R2. The MiniApp was not advertised widely in the Dynamics NAV community, mostly because it was the essential part of the Microsoft Dynamics C5 2014 rollout. Dynamics C5 is a local Danish product and therefore not very interesting for NAV people in the rest of the world. Since then, the MiniApp has been the topic of sessions at conferences around the world for its design patterns, but it is still not very well known. The MiniApp consists of a separate set of objects, and there are many interesting functionalities to steal – oops, sorry – to be inspired by. One of the cool things is that the RoleCenter has been expanded with a page part showing financial data.

To see it, first find the SMALL BUSINESS profile and make it the default or assign it to yourself:

Then restart the RoleTailored Client to show the Small Business RoleCenter:

The interesting part is the Trial Balance in the lower right corner. Using F6 to change focus to the Trial Balance window, Ctrl+Alt+F1 can be used to find the object number in the page (Some systems already use the Ctrl+Alt+F1 key - In that case Ctrl+Windows+Alt+F1 keys can be used instead):

Bingo: page 1393. Now, let us see if we can figure out how to fly this thing.

Here we have two interesting menu items:

Chart Information:

Hmmmm not enough so let us try the customize menu item:

Nothing here – time to dive into the code:

Following the Code into the MiniTrialBalanceMgt codeunit:

Ok, so this is actually a page and a codeunit to show an Account Schedule on the RoleCenter. But which Account Schedule? There is no official page to change the MiniTrialBalanceSetup data, but that can soon be fixed:

So the Account Schedule is I_MINITRIAL. Let us check it out:

And the Periods:

So now we are in familiar waters. All we need now is to implement the Page 1393 onto our own RoleCenter, and make an Account Schedule that fits our needs. I am going to implement it on a copy of the President profile:

Create a profile for the new object number and assign it to the president:

With a little imagination, there are dozens of uses for this little functionality:

  • Key numbers for Management
  • Department balance for the Department Head
  • Sales reporting for the sales people
  • Purchase reporting for the purchasers

The setup could be attached to the RoleCenter showing different Account Schedules for different roles or even users.

Looking at the Small Business RoleCenter, one could also consider using the Key Performance Indicators page part in the same way.

As I see it, the MiniApp can make a Maximum impact.


Debugging Dynamics NAV in a side-by-side installation

Posted March 15th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Many consultants have more than one version of Dynamics NAV installed. End-users in a migration phase will also often have more than one NAV version installed. In that case, the "installed" version can be different from the executed version, and an error like "The client version does not match the server version" occurs.

In this case it is because I am trying to run the Sessions list from the RoleTailored Client in Dynamics NAV 2013 R2, but the registered client is pointing towards the newly installed Dynamics NAV 2015 version. This can be changed relatively simply, but even more importantly, it is possible to make a set of files to switch the desired RoleTailored Client back and forth between the different versions.

The change must be made in RegEdit, and the simplest way to make the file is to go to RegEdit and search for the key that needs changing:

Inside Regedit, locate the key:


The content of the Key is something like this:

C:\Program Files (x86)\Microsoft Dynamics NAV\80\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe -protocolhandler "%1"

Now right-click on the command key and choose export:

Make a couple of copies of the files:

Now right-click each file and change the 80 to 71 or 70, depending on the version:

Windows Registry Editor Version 5.00



@="C:\\Program Files (x86)\\Microsoft Dynamics NAV\\71\\RoleTailored Client\\Microsoft.Dynamics.Nav.Client.exe -protocolhandler \"%1\""

Hereafter all it takes to switch is to double-click one of the files and answer Yes to the warning:
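The switch can also be scripted from a PowerShell (or command) prompt. The file name below is invented for illustration only:

```powershell
# Import the exported registry file for the desired version
# (hypothetical path - use the files exported above)
reg import "C:\NAVSwitch\NAV71.reg"
```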

Now debugging is possible in all installed versions after preparing the environment a little.




Tips and Tricks using the Dynamics NAV Debugger

Posted March 10th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Dynamics NAV 2013 introduced a new debugger to replace the old and worn-out debugger in the classic client. The classic debugger in Dynamics NAV 2009 did not support the RoleTailored Client, and that caused a lot of swearing and a lot of workarounds.

The new debugger has some new functionality that has been on the wish list for many years:

  • Watching variables in all levels of the call-stack
  • Debugging NAS’es
  • Debugging client sessions
  • Avoiding the annoying Caption Class translation in Codeunit 1
  • Creating conditional breakpoints
  • Everything is coded in C/AL, which means that it is possible to change the existing functionality and to add extra functionality.

Enabling/Disabling the Debugger

First of all, the debugger is no longer connected to the classic client or the development environment. It is connected to a session. A session can be:

  • RoleTailored Clients
  • Web Clients
  • Tablet Clients (Only Dynamics NAV 2015)
  • NAS Services
  • Web Services
  • OData Services

This means that it is possible to debug any of the above. In the settings of each instance, it is also possible to prevent debugging of a session.

Two settings seem to affect the debugger:

Debugging Allowed:    If the field is not selected, it is not possible to debug the session.

Enable Debugging:    When the field is selected, all C# files for the application are generated when the first client connects. It therefore has nothing to do with the debugger we are going to work with.

To prevent debugging in the live environment, it is advisable to keep Debugging Allowed cleared on all live instances and to create special instances for debugging purposes.
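The two settings correspond to keys in the service instance's CustomSettings.config file. The values shown are the ones suggested above for a live instance; the key names are assumed to match the NAV 2013 configuration schema:

```
<!-- CustomSettings.config for a live instance -->
<add key="DebuggingAllowed" value="false" />
<add key="EnableDebugging" value="false" />
```

Remember to restart the service instance after changing the file.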

Starting the Debugger

There are three ways to start the debugger:

Start from Development Environment:

The menu item Tools/Debugger/Debug Session will bring up a list of the active instances/sessions available to debug. It will only show instances/sessions connected to the database that has been opened in the development environment.

It is possible to "hard code" which instance the development environment is connected to by going to Tools/Options and typing in a different server/instance name:

The session list will show all the different sessions and client types currently connected to the instance.

Start from the RoleTailored Client:

In the RoleTailored Client, it is first necessary to connect to the instance that needs debugging. Once in, go to the Sessions menu item located at Departments/Administration/IT Administration/General/Sessions (or cheat by pressing Ctrl+F3 and typing sessions).

Start from the Run command:

Lastly, it is possible to start from the run command:


Notice that it is not possible to execute the command from the search field in Windows 7; it must be the Run command.

The sessions list

From the sessions list it is possible to activate the debugger depending on what to debug:


Debug:

The debugger stops when an error occurs in the selected session, which, by the way, could be operated by another user, or when a breakpoint has been set in the code in the development environment or previously in the debugger, in which case the debugger will stop on the breakpoint. There are also other breakpoint options, but when starting up, it is only possible to set the breakpoint in the code in the development environment.

Debug Next:

Stop on the first error/breakpoint in ANY of the active sessions. This is a little bit dangerous since any error made by any unsuspecting user will trigger the debugger, and for the user the session will “hang” all of a sudden while you take over their session. Later we will go through some of the uses of the Debug Next function.

Start Full SQL Tracing:

This is used when tracing the performance in the SQL profiler. This can be a topic of another blog post.

Setting the breakpoint in the Development Environment

From the Development Environment it is possible to set the breakpoint in the code somewhere. This however is only possible if the license includes Solution Developer. Just go to the object that needs debugging, find a place in the code and set a breakpoint by pressing F9. There is also a menu item Tools/Debugging/Toggle Breakpoint

The first time F9 is pressed, a red dot will appear next to the line:

Pressing F9 a second time leaves the mark as an inactive breakpoint, just to keep track of where the breakpoint used to be, but the debugger will not stop there.

How to activate the debugger if the license does not include Solution Developer

It is still possible to activate the debugger without a license for Solution Developer; it is just a little more complicated. When the debugger is activated, the debug page is shown, but no breakpoint has been set, so the debugger will not be triggered unless an error occurs. If, however, the Break button is pushed right after the debugger is activated, the debugger will stop at the next code that is executed.

This is also the way to stop the code in the middle of a user action. For example when the warning for exceeded credit limit and overdue payments is shown:

Click Break, click Yes in the warning – and "Got You":

Now the debugger has stopped right after the Credit limit warning.

The Debugging Window

The debugging window looks more or less like the old one:

The page consists of three sections:

  • The Code page
  • The Watch FactBox
  • And the Call Stack FactBox

In the code page, it is possible to see the code around the breakpoint and to see the actual position of the cursor. The new stuff is that it is possible to see the value of the variables just by hovering the mouse over the variable:

If there is more than one instance of the variable, or if it exists in different scopes, all instances will be shown.

The little + and the glasses enable you to add the variable to the Watch List FactBox. After clicking the glasses, the variable appears in the Watch List until it is removed again. If something else is debugged, it will be shown as out-of-scope.

Now the value of Posting Date is visible to me throughout the debug session.

The Call Stack FactBox shows the path which the system has followed to end up at the breakpoint.

Clicking one of the lines below will show the function that the system executed to get to the present position. The green arrow in the code page indicates that the position is not the actual one. Combined with a third window, the Variables window, it is possible to see the state and values of all variables at the time the function was entered.

A second place to add a variable to the Watch List is by clicking the variable in the Variable window and then the Glasses in the upper left corner.

Conditional breakpoints

A new functionality is the conditional breakpoint. In the “Old Days” a little code snippet was inserted in the code:
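The snippet looked something like this (the variable and the condition are illustrative):

```
// The poor man's conditional breakpoint: this line is only
// reached when the condition is true
IF CustNo = '10000' THEN
  MESSAGE('Hi'); // the breakpoint was set on this line
```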

Then the breakpoint would be set on the MESSAGE('Hi') line.

This is no longer necessary.

The conditional breakpoint can be created like an ordinary breakpoint. It will be shown as a breakpoint with a + inside:

Conditions can be set on most simple data types; exceptions are date, time and datetime fields. The only catch is that the conditional breakpoint can only be set in the debugger window. Therefore, a normal breakpoint is usually set first to activate the debugger, and then the conditional breakpoint can be set to replace the normal breakpoint.

Disable All Breakpoints

It is easy to get carried away setting lots of breakpoints at many different levels. Therefore, it is also good to be able to remove some or all breakpoints. The function Disable All will disable all breakpoints, including conditional breakpoints. A disabled breakpoint can be enabled again by pressing F9 on the breakpoint.

Another function is the Breakpoints function, which will open a new window with a list of all breakpoints:

Here the state of each breakpoint can be seen and it can be enabled, disabled or deleted.

Break Rules

In the Break Rules window a number of great new properties appear:

If the debugger should not stop on errors, the check mark can be removed here. Now the debugger will only stop if a breakpoint has been set.

A check mark in the Break On Record Changes field will cause the debugger to stop every time a record is changed.

Lastly, the debugger defaults to skipping all code in codeunit 1. This prevents the problem from previous versions with circular debugging due to the caption class functionality. If the debugging should include codeunit 1, the check mark must be removed here.

Starting/Stepping/Stopping the execution of the code

The functions to control the debugger are almost the same as in previous versions but with a number of welcome additions:

Step Into (F11)

Execute the next line of code. If the line includes one or more functions, the debugger will step into the function, showing and debugging the code in the function.

If there are more than one function in the line, the functions will be executed according to the precedence rules.

Step Over (F10)

Execute the next line of code. If the line includes one or more functions, the debugger will execute the code in the functions as normal but remain on the line.

Step Out (Shift+F11)

If the debugger stepped into a function by mistake, or if it is decided that there is no code in the function relevant to the problem being debugged, the Step Out button will execute the remaining code in the function as normal and then return to the line that originally called the function.


Continue (F5)

This will quit the debug session and execute the remaining code as normal. All breakpoints will remain, and the debugger window will remain open, ready for the next debugging session.


Break

No matter where the cursor is, the debugger will stop at the next line of code that is executed.


Stop

Quit the debug session, fire an error and roll everything back. All breakpoints will remain, and the debugger window will remain open, ready for the next debugging session. This is particularly important if the test scenario is difficult to recreate, e.g. the first time postings are made to a new item. By stopping the debugger, the test data will remain intact for future debugging.

Show Last Error

Will show the error that triggered the debugger.

Debugging Web services

Debugging web services is a little trickier because of the relatively short time the web service session exists. When the session only exists for a few seconds, it is not possible to click Debug on the session in the Sessions window. In this case, a separate instance only for web services and the Run command will come in handy:

dynamicsnav://localhost/Webservice80/CRONUS Danmark A/S/runpage?page=9506

Now the session window will start up empty:

Now press Debug Next, and the debugging window will start up as normal. Next, it is possible either to press Break or to wait for the debugger to break on a breakpoint previously set in the code the web service is supposed to execute. If the web services instance is the same as the instance for the NAS and the Windows Client, it is not recommended to press Break, since any code executed by any client will trigger the debugger. In that case, a breakpoint in the web service code is the only option. It is also advisable to disable break on errors, since any error made by any client will also trigger the debugger.

The best solution though, is to create a separate instance only for web services.
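Creating such a dedicated instance can be done from the NAV administration shell (assuming NAV 2013 R2 or later; the instance name, ports and database name below are examples only):

```powershell
# Create and start a service instance dedicated to web services
New-NAVServerInstance Webservice80 `
    -DatabaseServer localhost `
    -DatabaseName 'Demo Database NAV (7-1)' `
    -ManagementServicesPort 7145 `
    -ClientServicesPort 7146 `
    -SOAPServicesPort 7147 `
    -ODataServicesPort 7148
Set-NAVServerInstance Webservice80 -Start
```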

Debugging NAS services

In previous versions, it was a little difficult to debug the NAS. In the versions from 2013 and forward, all it takes is to log on to the NAS service instance with the Windows Client (or to run the Sessions window with the NAS instance name as described previously), start the debugger and use the same procedure as described above.




Automated preparation of Test Companies in Dynamics NAV

Posted March 1st, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

We all know the situation: A customer needs a copy of the live environment to run a number of tests, or it could be that a new test/development environment must be made periodically. Often it is necessary to update the test company with the latest data. Therefore, it is necessary to create a copy of the live database and restore it with a different name. Having done that, it must be prepared so that data from the test company cannot be confused with data from the live company.

The preparation can involve quite a number of things:

Change Company

The company must be renamed in Companies.

Rename the Company Name

The company name must be changed in Company Information so that all sender addresses on external documents clearly reflect that they originate from a test company.

Change the System Indicator

I also change the System Indicator to tell the users that they are in a test environment, both by changing the text shown in the corner to show the word TEST clearly and by showing the date the test company was based on. This will help the users later when running tests in the system.

Giving this signature in the right corner of each page in the Windows client:

Redirect printers

It is also necessary to redirect the printers so picking lists from the test system are not suddenly printed in the warehouse, or production orders in the production department. Even worse would be external documents like purchase orders, sales quotes or orders accidentally being printed and sent by mistake.

Redirect Freight integration

If integration with the shipping company is automated, then it is advisable to make sure that it is disabled in the test company. Otherwise, you will face some very angry truck drivers.

Stop Job Queues

There is no need to keep all job queue entries in the Job Queues running, like automatic posting of invoices or printing of invoices. So stop all the jobs that will give problems in a test environment.

Stop EDI flow

If the company communicates with external partners with EDI or the like, it is necessary to redirect all documents to a test folder.

Prevent Electronic Invoices

Many companies send invoices electronically. The paths that are used for sending and receiving invoices and credit memos must be changed so nothing is sent or received by mistake.

Stop Electronic Payments

Payments sent and received are usually initiated manually, so the chances of mistakes are few. However, it is best practice to change all paths so the payments are not imported from or exported to the bank by mistake.

Redirect Captured Documents

Capturing purchase invoices is commonly done in companies, and the paths used for the captured documents must be changed.

Stop or redirect Inter-Company Flow

If the company is part of a larger organization with an automated IC flow, it is very often desirable to make sure that there are two IC flows: one for the live transactions and one for the test environment; otherwise it can be difficult to test the Inter-Company functionality. In any case, the functionality must be redirected so the flow either stops or runs on the test environment instead.

Stop or redirect Master Data Replication

If the company is part of a larger organization with an automated Master Data Replication flow, it is, just like with the IC flow, desirable to make sure that there are two replication flows: one for the live replications and one for the test environment. The functionality must be redirected so the flow either stops or runs on the test environment instead.

Automate the tasks

These are only some of the changes that need to be done. Others could be integrations to production facilities, machines, BI databases or CRM systems. The list can be endless.

Now, the customers will not be happy if they have to call a consultant to perform all these tasks, not to mention the actual backup/restore, every time they need a new test environment, and even worse, to remember all the different tasks that need to be done after the restore. Secondly, the number of databases that must be copied in a large organization can make the task so complex that errors will eventually happen.

Therefore, I have automated the creation of the test environment for my customers. It comes in two or three parts, depending on which Dynamics NAV version the customer runs:

  1. The backup of the live database and restore to the test environment

    This is not covered here in detail, but it can be performed in many ways:

  • Manually from SSMS
  • Automatically with a SQL job
  • Automatically with PowerShell (SQL Server 2012 & SQL Server 2014)


  2. The company must be renamed in the test database. If the customer runs Dynamics NAV 2009 or 2013, it must be done manually, but in Dynamics NAV 2013 R2 or 2015 there is a PowerShell CmdLet for exactly that.

  3. All the rest must be performed inside the company in NAV. Build a codeunit to handle all the changes. If the customer runs Dynamics NAV 2009 or 2013, it must be run manually, but in Dynamics NAV 2013 R2 or 2015 there is also a PowerShell CmdLet to execute the codeunit.

If the test company is in the same database as the live company, a simple PowerShell script can be run to ensure that everything is done properly. This will work in Dynamics NAV 2013 R2 and 2015.

This is the script for real Men; it removes the company NO QUESTIONS ASKED. A gentler version could be without the –Force parameter on the Remove-NavCompany command. This will politely ask before removing the company. Of course, this could be done manually, but if we face 30 or 50 companies then the script is THE solution.
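A sketch of such a script could look like this. The instance name, company names and codeunit ID are placeholders, and the NAV management module is assumed to be imported (e.g. via the NAV administration shell):

```powershell
# Remove the old test company - the -Force switch skips the confirmation
Remove-NavCompany -ServerInstance DynamicsNAV71 -CompanyName 'TEST Company' -Force

# Copy the live company to a fresh test company
Copy-NavCompany -ServerInstance DynamicsNAV71 `
    -SourceCompanyName 'Live Company' -DestinationCompanyName 'TEST Company'

# Run the preparation codeunit inside the new test company
Invoke-NAVCodeunit -ServerInstance DynamicsNAV71 `
    -CompanyName 'TEST Company' -CodeunitId 50000
```

The gentler version simply leaves out -Force on the Remove-NavCompany line.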

The CodeUnit

Creating the codeunit usually involves a lot of "hard code". Here is an example:
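A sketch of what such a codeunit's OnRun trigger could look like, covering the first two tasks above. The table and field names are standard NAV 2013; the TEST texts, the codeunit number and the remaining steps are illustrative only:

```
// Sketch of codeunit 50000 "Prepare Test Company" - OnRun trigger.
// Local variables: CompanyInfo : Record "Company Information";
//                  JobQueueEntry : Record "Job Queue Entry".
OnRun()
// Mark the company name and the system indicator as TEST
CompanyInfo.GET;
CompanyInfo.Name :=
  COPYSTR('TEST ' + CompanyInfo.Name,1,MAXSTRLEN(CompanyInfo.Name));
CompanyInfo."System Indicator" :=
  CompanyInfo."System Indicator"::"Custom Text";
CompanyInfo."Custom System Indicator Text" :=
  STRSUBSTNO('TEST environment based on %1',TODAY);
CompanyInfo.MODIFY;

// Put all job queue entries on hold so nothing posts or prints automatically
JobQueueEntry.MODIFYALL(Status,JobQueueEntry.Status::"On Hold");
```

The remaining tasks (redirecting printers, paths, EDI folders and so on) follow the same pattern: GET the relevant setup record, change the fields and MODIFY.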




Opening for charges purchased directly to the production

Posted February 26th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

“Why can’t I post charges directly to a production order without making special operations and create the vendor as a subcontractor on a work center” is a question I often meet.

The answer: You can do just that, it just takes a little preparation and design access to the tables.

On a purchase order, it is possible to see which production order number and production order line generated the purchase order. This is the normal way when it is used for subcontracting. The information is set automatically when the purchase order has been created from the subcontractor worksheet.

However, sometimes charges for, e.g., freight are billed on a separate invoice from a vendor different from the actual subcontractor. In that case, it is convenient to be able to post a purchase order that is charged directly to the production because Item Charges cannot be used with purchase orders from subcontractors.

The three fields on the purchase order line defining the subcontractor are the Production Order No., Production Order Line No. and Work Center No.

By default, the three fields are not editable in the page.

To enable it, perform the following simple change:

  1. Go to the development environment
  2. Open the database, click the Table menu item
  3. Navigate to table number 39
  4. Locate the three fields numbered 5401, 99000752, and 99000754
  5. On each field go to the menu item View > Properties
  6. Find the property Editable and change it to “Yes”
  7. Save and exit.
Other fields are also available on the purchase order line and can be included, but the three mentioned are the only ones necessary.
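In exported object text, the change looks roughly like the sketch below. The field names and data types are sketched from the numbers listed above, not copied verbatim from the NAV source; only the Editable property is new:

```
// Table 39 Purchase Line - sketch, abbreviated field definitions
{ 5401    ; ;Prod. Order No.     ;Code20  ;Editable=Yes }
{ 99000752; ;Prod. Order Line No.;Integer ;Editable=Yes }
{ 99000754; ;Work Center No.     ;Code20  ;Editable=Yes }
```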



Now the three fields are editable on the purchase order. Then I need to create one new Work Center for charges. The vendor chosen for the work center can be any random vendor; the field just has to be filled. An existing work center can also be used, as long as it is a subcontractor.

Now I am ready to post charges directly to production orders.

The item number used on the purchase order must be the produced item from the production order.

So I create a purchase order with the vendor number from my freight vendor, the item number from the produced item and the new work center.

Amazing how versatile London Postmaster is.

Then I fill these three fields:

Post the purchase order and Navigate tells the story:

No item ledger entries, only value entries. No items will be received. The cost will be treated as a service charged directly to the production Work in Progress (WIP).

From the production order, it looks like this:


In Finance it looks like this after the Inventory batch has run:

Here the cost ends up on account 2140, which is the WIP account.

Keine Hexerei, nur Behändigkeit (no witchcraft, just sleight of hand).




Checking your Dynamics NAV License and object versions

Posted February 23rd, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

When I first started at Columbus NSC and was sent out on my first assignment, I faced the usual license struggle:

  • What was included in the license?
  • Which extra object number series was included in the license?
  • Who is in charge of assigning object numbers?
  • How do we control the builds?

So I called my highly skilled colleague Martin Jordt Hansen, who has been at Columbus NSC for many years. His answer was to send me a small tool: the Check License objects.

Three small objects that made a big difference:

Running the main page, Check License, will calculate all the needed information temporarily.

Why not just make a page showing the 2000000043 License Permissions table, you might ask?

Because the Check License objects show a lot more information:

All the objects included in the license are shown here, but more importantly, it also shows which objects are already used. Here we can see that the next available table object is number 50012.

The real goodies are at the bottom of the page:

Here it will show all the latest version numbers in the database. This is one of the good ones, so we can read the following from it:

  • It is a Dynamics NAV 2013
  • It is based on a W1 version with a Build 36005 (Rollup 10)
  • It is a Localized Danish database
  • It has IEM and BIS from To-Increase
  • It has Payment Management
  • Our own Build Number is NSC1.30
  • And a lot of other add-ons

Martin has graciously decided to share these objects and they are therefore available on

So go get it!




Posting freight lines automatically from the WMS Shipment in Dynamics NAV

Posted February 17th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

Some of my customers using the Dynamics NAV Warehouse Management (WMS) module face the same issues with this WMS setup:

  1. The Sales Orders are created by the sales people and a freight line is applied, either as a G/L line or as a resource line.
  2. When the sales order is released for delivery, shipment lines are made based on the sales order, either by the sales person, by the warehouse manager or by the warehouse employee himself.
  3. Depending on the setup, a pick is created and posted to “transport” the item from the picking zone to the shipment zone.
  4. Lastly, the warehouse shipment is posted.
  5. At nighttime, all sales orders with posted shipments are automatically invoiced and invoices are printed/sent by email.

Using the WMS module means that the warehouse employees do not necessarily see the sales order, because they have all the necessary information on the Warehouse Shipment. However, only item lines are transferred to the warehouse shipment, so all other line types have to be handled manually on the sales order. An automation like this will leave all G/L lines and resource lines behind on the sales order, to be posted separately later. This also means that the customers will receive separate invoices: one for the items and one for the freight.

There is a fairly simple solution to the problem. It does demand a license to change codeunits though.

In the Warehouse setup table (5769), two fields are created:

  • Post G/L lines with Shipments
  • Post Resource lines with shipments.

Two fields if both G/L lines and resource lines should be posted; otherwise one field is enough.

And the equivalent Warehouse Mgt. Setup page (5775):

In the Whse.-Post Shipment Codeunit (5763):

Find the function HandleSalesLine, make a local variable…

…and insert the following lines just before the end of the function:
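The inserted lines could look something like the sketch below. This is hypothetical code: SalesLine2 is the new local Sales Line record variable, the setup fields are the two new fields from above, and the surrounding variable names may differ in the actual codeunit:

```
// Sketch: set any remaining G/L and resource lines to ship
// together with the warehouse shipment
WhseSetup.GET;
SalesLine2.SETRANGE("Document Type",SalesHeader."Document Type");
SalesLine2.SETRANGE("Document No.",SalesHeader."No.");
IF SalesLine2.FINDSET(TRUE) THEN
  REPEAT
    IF ((SalesLine2.Type = SalesLine2.Type::"G/L Account") AND
        WhseSetup."Post G/L lines with Shipments") OR
       ((SalesLine2.Type = SalesLine2.Type::Resource) AND
        WhseSetup."Post Resource lines with shipments")
    THEN BEGIN
      SalesLine2.VALIDATE("Qty. to Ship",SalesLine2."Outstanding Quantity");
      SalesLine2.MODIFY(TRUE);
    END;
  UNTIL SalesLine2.NEXT = 0;
```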


Now if the fields are activated in the setup table, freight lines made as G/L lines or resource lines on the sales order will always be included in the posting of the shipment, and thereby be included on the invoice.

To keep the solution as close to standard as possible, the fields could be created in a customer specific setup table and page. This way the only change will be in the codeunit.




Cancelling an active production order in Dynamics NAV

Posted February 14th, 2015 by Peik Bech-Andersen
Categories: Microsoft Dynamics NAV 2013

A production order can be created in four different statuses. In three of them, it is not a problem if a production order was created by mistake. The "Simulated", "Planned" and "Firm Planned" production orders can be edited or deleted endlessly. However, the "Released" production orders can only be edited and deleted freely until the first posting has been made. Then it is no longer possible to delete them.

This is a problem if the components of the production order, or the capacity on the routing of the production order, have been set up with forward flushing. This will post the components or the time consumption when the production order is released, or when a released production order is refreshed. This also means that a production order that is accidentally released cannot be returned to its previous status, and certainly not deleted.


Why not? Because entries exist on the production order. This means that one of the following tasks has been performed:

  1. Consumption has been transferred from inventory to Work in Progress (WIP). The consumption will create an item ledger entry and a value entry.
  2. Consumed time has been registered to the production order. This will create a capacity ledger entry and a value entry.
  3. Output has been registered to the production order. The output will create an item ledger entry and a value entry.

So in order to delete the production order, it is necessary to:

  1. Reverse all consumption transactions. (Watch out for the cost price of the returned item)
  2. Reverse all capacity ledger entries with the time consumptions
  3. Reverse all output transactions
  4. Change the quantity of the finished item to 1 and post the output
  5. Reverse the output.

This can be done in the production journal.

An example:

Production order 101041 has been created, and postings have been made by mistake.

There are all types of entries posted to the production order:

Item ledger entries, both consumption and output:

Capacity ledger entries:

So all transactions are reversed:

A special thing to remember is that the output of the last operation must be applied to the original output item ledger entry to ensure correct reversal. The consumption can be applied from the correct item ledger entry for the same reason.

This will reverse all costs on the production order:

Now all that is left is to change the quantity to 1 and post the output.

Is this really necessary? Yes, otherwise the Change Status on the production order will fail with this error:

So there is no way around it.

Now the output has been posted with a value of zero:

This posting must be reversed in the item journal. Remember to apply to the output entry.

Now the production order can be changed to “Finished”.

Should this be an automated function? Possibly - if the components have been set up with forward flushing, then it could be an idea. Otherwise, it is okay that the reversal is a little bit difficult, just to encourage users to get it right the first time.