Wednesday, July 25, 2012
Integrating Google Apps with Microsoft Dynamics NAV
A framework has been created whereby NAV data can be sent to Google Apps. ArcherPoint, or the community in general, can build on this framework to give an organization as much synchronization as it desires. The framework currently supports authentication on an individual basis and can even store username and password credentials for automatic re-authentication. A huge potential exists … imagine displaying one or more customers as markers on a Google map, linking to product images in Picasa, or even displaying grid-like form data in a Google spreadsheet.
Three videos have been created to demonstrate the framework. In this first video, a Contact record is uploaded to Google Apps:
This second video demonstrates how the Customer Online Map feature links to Google Maps; in addition, a new report creates markers on a Google Map for all customers matching a filter:
The third video demonstrates how the data in a List Page is uploaded to a spreadsheet in Google Drive:
Are you ready to begin? Download the framework and begin your integration with Google Apps!
You can also view these videos, along with many other Microsoft Dynamics NAV tutorials, on ArcherPoint's YouTube channel.
Using NAV Account Schedules to Create Cash Flow Statements
This blog is an attempt to document how a Cash Flow Statement can be created using standard NAV Account Schedules.
I’ve been asked a few times by finance-minded individuals for a Cash Flow Statement report (no, there is not an out-of-the-box report . . . darn!). It usually takes me a couple of attempts to explain how to accomplish this using Account Schedules, and I usually kick myself for not keeping an example on hand.
As some may know, there are two methods, Direct and Indirect, that can be used for a Cash Flow Statement. I understand the Indirect Method is the more common of the two; regardless, I chose to do both in this blog. Other than the operating activities section, the methods are similar. A well-structured chart of accounts will greatly assist in the setup and maintenance of this Account Schedule. You’ll see from my examples that I’m largely using Total Accounts; hopefully this eliminates the need to revisit the account schedule if new accounts are set up in the future. For those of you who are new to NAV or have not yet implemented NAV, a good exercise may be to consider the structure needed for a cash flow statement. This may dictate certain accounts and structure in your chart of accounts for your reporting requirements.
My examples are all from the NAV 2013 BETA (CRONUS USA, Inc.), but they should apply to prior versions as well. I’m sure my examples aren’t fully GAAP compliant, but I think you’ll understand the basics so that you can incorporate them into your own schedules.
Indirect Method Account Schedule:
Indirect Method Account Schedule comments:
- Total Accounts (in the Totaling Type column) are used for many account schedule lines.
- In the operating activities section, the non-cash Depreciation and amortization line has a unique row number (095). Row No. 095 participates in two formulas: the Net cash flows from operating activities line and the Net increase in cash and cash equivalents line.
- The Cash and cash equivalents, beginning of period line uses a Row Type of Beginning Balance.
- The Show column is used to show only certain rows on the printed report.
- Check Total lines will only show up on the printed report if the report is out of balance with the cash accounts (certainly not necessary, but I like to have check totals just in case).
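The screenshots from the original post are not reproduced here. As an illustration only – row numbers other than 095, the totaling values, and the exact formulas are assumed, not taken from the original – the indirect-method lines described above might be laid out like this:

```
Row No.  Description                                     Totaling Type     Row Type           Show
...
095      Depreciation and amortization                   Posting Accounts  Net Change         Yes
100      Net cash flows from operating activities        Formula           ...+095            Yes
...
200      Net increase in cash and cash equivalents       Formula           ...+095            Yes
210      Cash and cash equivalents, beginning of period  Total Accounts    Beginning Balance  Yes
220      Cash and cash equivalents, end of period        Formula           200+210            Yes
```

The point is simply that row 095 appears in both the operating-activities formula and the net-increase formula, and that the beginning-of-period line relies on the Beginning Balance row type rather than a separate account.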
Direct Method Account Schedule comments:
- The operating activities section is where the differences from the Indirect Method reside; the other sections are largely the same.
- Note that I chose to set many rows with Show = No. I found it was nice to see those rows in the Account Schedule Overview when reconciling (those rows won’t show on the printed report).
Column Layout comments:
- Again, I chose to set Show = Never for certain columns so that I could see them in the Account Schedule Overview but not on the printed report. This is very useful when setting up and reconciling the report.
Account Schedule Overview comments:
- Here you’ll be able to see columns and rows even if you set Show = Never in the column layout and Show = No in the account schedule. Those settings apply only to the printed report.
I hope this blog provides an example you can follow to build your own Cash Flow Statement from standard NAV functionality.
http://www.archerpoint.com/blog/Posts/using-nav-account-schedules-create-cash-flow-statements
Tuesday, July 17, 2012
The 55-Article Amendment to the New Turkish Commercial Code (TTK)

A 55-article set of amendments to the Turkish Commercial Code (Türk Ticaret Kanunu) has been made. In brief, the changes are as follows.
The transaction auditor has been abolished: The transaction auditor’s responsibilities have been transferred to the Board of Directors, and its duties to court-appointed experts, ministry officials, and specialist institutions.
Bookkeeping according to TFRS has been removed from the law: The requirement that the books referred to in the law be kept in accordance with Turkish Financial Reporting Standards (TFRS) has been removed from the text. Paragraph 5, added to Article 64, states explicitly that all natural and legal persons subject to the law must comply with the Tax Procedure Law when keeping their books.
The obligation to prepare financial statements according to TFRS continues: All companies remain obliged to prepare their financial statements, according to their size, in accordance with TFRS or with reporting standards to be issued by the Public Oversight Authority (Kamu Gözetim Kurumu), provided these do not conflict with the TFRS Conceptual Framework. In other words, in every reporting period all natural and legal persons must prepare financial statements in accordance with TFRS, or with accounting standards that do not conflict with the TFRS Conceptual Framework.
Mandatory independent audit will apply only to companies designated by the Council of Ministers: The most important change is the narrowing of the scope of the audit. The Council of Ministers alone will decide who is subject to independent audit.
The requirement that the auditor be an SMMM (CPA) or YMM (sworn-in CPA) has been removed from the law: One of the most important changes is that the requirement for the independent auditor to be an SMMM or YMM has been removed from the text, with full authority over who may act as auditor left to the Public Oversight Authority. In practice, under Decree-Law No. 660, which established the Public Oversight Authority, the Authority may select independent auditors only from among members of the profession. So even if the draft becomes law as is, the persons eligible to be appointed as independent auditors will, by virtue of Decree-Law No. 660, still be SMMMs or YMMs.
The auditor can still force the Board of Directors to an election through its opinion: The auditor’s right to compel the Board of Directors, through its opinion, to convene a general assembly and hold an election remains in the draft. However, the case of abstaining from an opinion has been removed; this right now applies only when an adverse opinion is issued.
Obligation to prepare financial statements consistent with the TFRS Conceptual Framework: All natural and legal persons in Turkey must, in addition to the financial statements derived from their tax records, prepare financial statements in accordance with TFRS or with accounting standards issued by the Public Oversight Authority that do not conflict with the TFRS Conceptual Framework. For companies designated by the Council of Ministers, these financial statements are also subject to independent audit.
Ban on borrowing from the company: The ban on borrowing from the company has been almost entirely lifted. Under the new arrangement, borrowing is permitted for everyone except shareholders who still owe capital contributions to the company.
Mandatory website: The scope of the website requirement has been narrowed considerably; it is now mandatory only for companies subject to independent audit.
Labels: DYNAMICS NAV, IFRS, Microsoft Dynamics Navision, NAV 2013, NAV 7, TFRS, TTK, UFRS, Yeni TTK
Monday, July 16, 2012
Experience You Can Trust: Tradesoft Expertise

Our Dynamics NAV Experience
| Dynamics NAV / Navision Experience | |
|---|---|
| Dynamics NAV (Navision) experience | More than 94 years combined |
| ERP experience | More than 130 years combined |
| Dynamics NAV (Navision) implementations | More than 50 |
| Dynamics NAV (Navision) upgrades | More than 20 |
| Web / eCommerce integrations | 20+ |
| Barcode / RFID | 10+ |
| Credit card processing integrations | With the 6 most widely used banks in Turkey |
| POS solution integrations | 500+ points of sale |
| EDI (Electronic Data Interchange) | 30+ |
| Other integrations | 10+ |
Dynamics NAV / Navision Certifications:
- Financial Management
- Manufacturing
- Warehouse Management
- Advanced Distribution (Trade)
- Trade & Inventory
- CRM / Relationship Management
- Service Management
- C/SIDE Introduction
- C/SIDE Solution Development
- Installation & Configuration
- SQL Server
- Sure Step – Implementation
- Project Management
Dynamics NAV Applications:
- Finance
- Accounting
- Account Schedules
- Budgeting
- Forecasting
- Consolidation
- Multi-Company
- Multi-Currency
- Sales Order Processing
- Turkish Payroll
- Purchase Order Processing
- Manufacturing
- Distribution
- Warehouse Management
- Shipping
- Receiving
- Barcode
- Costing
- Professional Services
- Time and Expense Entry
- Projects
- Job Costing
- Dashboards, KPIs
- Responsibility Centers
Dynamics NAV Version Upgrades / Versions

We have carried out 20+ Microsoft Dynamics NAV version upgrades and have experience with the following versions:
- Navision Financials 2.60
- Navision Attain 3.10A, 3.70B
- Microsoft Business Solutions Navision 4.0, 4.0 SP1, 4.0 SP2, 4.0 SP3
- Microsoft Dynamics NAV 5.0, 5.01, 5.02
- Microsoft Dynamics NAV 2009, 2009 SP1, 2009 R2
Dynamics NAV Integrations:
- Logo Tiger (the most widespread local ERP provider in Turkey)
- Numerous web sites and e-commerce portals
- Microsoft CRM
- Digital archiving systems
- Supplier systems (Adidas, Reebok, Nike)
- DBS (Direct Debit System)
- Credit card processing (with the major banks)
- Meal vouchers (Sodexo, Ticket, etc.)
- Payment systems (with the major banks)
- Statutory filings (tax returns)
Dynamics NAV Technologies:
- SQL Server and hardware server setup, architecture, optimization
- Microsoft Dynamics NAV on virtual machines (VMware and Virtual Server)
- Remote access and support
- Terminal Server, Citrix
- SharePoint integration
- XML integration with Microsoft Dynamics NAV
- SaaS / private cloud / hosted Microsoft Dynamics NAV deployments
- Web services
- SOAP
- .NET
- C++
Examples of development carried out on Dynamics NAV:
- Account schedules
- Analysis by dimensions
- Custom reports
- Dashboards, charts, KPIs
- Data migration
- Multiple NAS usage
- Web service usage
- Automatic invoice transfers
- Data entry from Excel templates
- Scrap/waste management
- Reporting Services (SSRS)
- Emailing of statements and invoices
- Cash flow
- Consolidated reports
- Cash withdrawal management
- Physical inventory
- Stock counts
- Requirements planning
- Supply planning
- Machine center optimization
- Warehouse operations
- Management of special fixed assets
- Letters of guarantee
- Assignments of receivables
- VAT management
- VAT exemptions
- VAT refunds
- Exchange rate differences
- Document management
- Bank reconciliations
- Intercompany transactions within a group
- Time and expense management
- Rent management
- Royalty income and reporting
Tradesoft has carried out migrations to Dynamics NAV from the following programs:
- Logo Unity
- Logo Tiger
- Logo Go
- Logo Gold
- Netsis
- Mikro
- Uyumsoft
- Orka
- EuroPro
- Link
- Set
Sunday, July 15, 2012
Length of G/L Account Name in NAV 2013
by Vjekoslav Babic on June 10, 2012
A small but important change often slips under the radar of the What’s New kinds of documents. One of those is the standard length of the Name field in G/L Account table. I’ve just noticed that in Microsoft Dynamics NAV 2013 the length of this field has been increased from 30 characters to 50 characters.
While this seems a minor thing, it’s actually a huge improvement. If 30 characters was not enough in previous versions, increasing it was not simple and required changing thirty or so other objects as well. It was, in fact, one of those annoying things you were better off getting used to rather than changing. Yes, I’ve seen customers who insisted on changing it, but most of them simply gave in.
In NAV 2013, this change is not only about G/L Account – the length of all Name and Description fields in all master tables has been consistently set to 50. In the previous versions of NAV the length varied between 30 and 50, but now all of the master table Name and Description fields are of length 50.
A small step for man, a giant leap for mankind.
http://navigateintosuccess.com/blog/length-of-gl-account-name-in-nav-2013
Top 5 SQL Server Improvements in NAV 2013
by Vjekoslav Babic on June 21, 2012
Performance is one of those things you can’t get enough of and NAV is one of those systems where an extra operation per second is always welcome. Yesterday, during the Expert Panel at the NAV day of the Decisions Spring conference, there was a question: is there any improvement in how NAV 2013 works on SQL Server.
And the answer is: oh yeah!
As a matter of fact, everything is new and improved.
Jörg has already posted an overview of what’s new for NAV on SQL Server in his latest blog post, but I still think there’s room for a couple more words on the really amazing palette of news and improvements.
As I said, the SQL Server improvements are plenty. Here’s the list of the top 5 technical improvements that rock my boat.
1. Cursors are gone
If there was a single thing killing performance in NAV, it was server-side cursors. The burden on SQL Server, especially in critical multi-user environments, was tremendous, and I’ve seen server monsters crawling under the pressure. The cursors have been replaced with MARS (Multiple Active Result Sets), which basically takes the chore of browsing through a recordset away from SQL Server and assigns it to the NST.
2. Caching
Apart from MARS, another killer improvement is the caching. Most data access operations are cached on the NST, which results in a considerable reduction in the number of SQL Server calls. Now, caching alone is a great improvement, but caching + MARS is a winner.
Try profiling a simple thing, such as this:

IF Cust.FINDSET THEN
  REPEAT
  UNTIL Cust.NEXT = 0;
Run it a couple of times in a row. Under NAV 2013, you get a single SELECT against the SQL Server, then nothing else. The iteration happens on the NST, and every consecutive call to the same stuff does everything on the NST. Try that under NAV 2009, and the profiler goes bananas.
3. SIFTs
There are several improvements in how NAV 2013 handles SIFTs. First – you don’t have to explicitly declare SIFT fields on keys. You can do CALCFIELDS and CALCSUMS on any decimal field, regardless of the structure of keys on the source table. And SQL simply calculates the value. This relieves SQL from maintaining too many indexed views. Yes, I know, it also slows the read operations slightly, but did I mention the caching? Oh, sorry, I have. There.
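For example (a sketch only – the record variable and the account filter are assumed, not taken from the post), in NAV 2013 a sum over a decimal field no longer depends on a SIFT-enabled key:

```
// GLEntry : Record "G/L Entry";
// Works in NAV 2013 even if no key maintains a SIFT index for Amount
GLEntry.SETRANGE("G/L Account No.", '1110');
GLEntry.CALCSUMS(Amount);  // SQL Server computes the SUM for the filtered set
MESSAGE('Sum: %1', GLEntry.Amount);
```

In earlier versions, the same CALCSUMS call would have required a key with the SumIndexFields (SIFT) property covering Amount.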
Another improvement is that you can include the SIFT fields into the SQL statement, and get the SIFTs with the same single SELECT statement that NST issues against SQL. You do this with the SETAUTOCALCFIELDS statement which you call on a record variable just before you FIND or FINDSET the records.
Compare these two in the profiler, and it’s clear right away:
a) with CALCFIELDS

IF Cust.FINDSET THEN
  REPEAT
    // Balance is not calculated; we have to do it manually
    Cust.CALCFIELDS(Balance);
  UNTIL Cust.NEXT = 0;
b) with SETAUTOCALCFIELDS

Cust.SETAUTOCALCFIELDS(Balance);
IF Cust.FINDSET THEN
  REPEAT
    // No need for CALCFIELDS; Balance is already returned
  UNTIL Cust.NEXT = 0;
With option a, whenever you hit the CALCFIELDS, the NST obeys and fetches the sum. With option b, there is a single SELECT statement, which already includes an OUTER APPLY clause that calculates the SUM for each row retrieved.
Pretty cool stuff.
4. ADO.NET
The whole shebang now runs on ADO.NET, instead of the OLEDB/ODBC it used before. There are plenty of benefits to that, performance included.
ADO.NET streamlines deployment and administration, increases performance, reduces the number of SQL connections (Jörg has explained some drawbacks of this access, but I think generally that this is a good thing), reduces the memory consumption, and maybe a couple other things.
5. Unicode
I’ve already blogged about this, Jörg has also mentioned this, so I won’t play the same tune yet another time. NAV is now Unicode, which allows you to store characters in any language, at the same time.
Unfortunately, Unicode is not as Unicode as I’d truly love it to be, because the object captions remain tied to the chosen database collation (yes, you still need to choose this). That practically means that while you’ll be able to store characters from any alphabet, your RTC user interface will remain limited to a single character set.
Wrap up
So, to wrap it up, there is a lot of new things, bigger or smaller, that have been changed and that warrant better performance, or user experience, or both.
You may notice that I didn’t mention queries. Yes, they are a mind-boggling improvement over previous versions, but they are simply a completely new feature, not something that NAV had and now has better than before. My list here is the list of tweaks and tune-ups that take the things we are used to having to a whole new level. Queries? Well, they are out of this world, but their true power is yet to come – when (I’m fairly sure it’s a matter of “when”, not “if”) we’ll be able to use them as sources for pages or reports.
http://navigateintosuccess.com/blog/top-5-sql-server-improvements-in-nav-2013
Benchmarking Results: NAV 2013 Outperforms All Previous Versions
by Vjekoslav Babic on June 25, 2012
Marketing is nice as long as it matches the reality. With Microsoft Dynamics NAV 2013, Microsoft has promised a lot of improvements, but how well does NAV 2013 stand the reality test?
Apparently, outstandingly well.
Over the past two days, I have intensively tested NAV 2009 and NAV 2013 through a series of five different tests that measure different aspects of NAV data handling. My conclusion is clear: NAV 2013 is faster than any NAV you have ever seen, including the Classic client on the native database.
Continue reading to find out more about my findings and testing approach.
Is This Some Kind Of A Trick?
No, this is not a trick. It’s for real.
Several days ago I wrote about performance improvements in Microsoft Dynamics NAV 2013, and then got a comment that it all looks nice in theory, but that NAV 2013 is actually slower than NAV 2009. Per Mogensen of Mergetool.com has done some testing and published a video demonstrating the results.
I’ve reviewed the video and noticed a couple of possible issues with how the performance was measured, so I decided to check for myself. My results show something completely different: not only is NAV 2013 faster than NAV 2009, it’s also faster than the Classic client on the native database – kind of a holy grail of NAV performance.
And then I double-checked with Per, and he confirmed to me that he also noticed a couple of problems himself. He has repeated the tests, and his tests now also show great improvement in NAV 2013. His updated video is here.
But, let’s continue with my results.
The Racing Horses
To find out how fast NAV 2013 really is, I’ve compared it to other flavors of NAV. The racing horses were:
The Environment
All of the applications have had exactly the same operating conditions, under exactly the same environment and system configuration settings. The following are the system specifications:
The Tests
All six applications had to endure the same testing conditions and ran the following tests:
Before each of the tests, I prepared the environment by doing the following:
Each of the tests records the time right before the test starts, and then again right after it ends. The time difference is then logged into the database.
I measured the time by creating two DateTime instances, setting them to the current system time, then subtracting the start time from the end time. This gives the duration in milliseconds. In addition, under NAV 2013 I added another measurement method: the .NET System.Diagnostics.Stopwatch class, just in case – if anything were flawed with NAV’s time handling in 2013, certainly nothing would be wrong with the .NET Stopwatch. As expected, there was no difference between what NAV calculated and what System.Diagnostics.Stopwatch measured.
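The measurement code itself isn’t included in this copy of the post; a minimal C/AL sketch of the approach described (variable and procedure names are assumed) could look like this:

```
// StartTime, EndTime : DateTime; Elapsed : Duration;
// Stopwatch : DotNet "System.Diagnostics.Stopwatch";  (NAV 2013 only)
Stopwatch := Stopwatch.StartNew();
StartTime := CURRENTDATETIME;

RunTest;  // hypothetical procedure containing the test body under measurement

EndTime := CURRENTDATETIME;
Stopwatch.Stop();
Elapsed := EndTime - StartTime;  // a Duration, expressed in milliseconds
// Cross-check Elapsed against Stopwatch.ElapsedMilliseconds, then log both
```

Subtracting two DateTime values in C/AL yields a Duration in milliseconds, which is what makes the direct comparison with the .NET stopwatch possible.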
In the results, all measurements I present are in milliseconds, and in all test results I’ll show, less is better.
The Results
Finally, we get to the point which I believe you await as much as did I: the results. Let me present them test by test.
1. Sales Orders
In Per Mogensen’s tests, the NAV 2009 Classic client on a native database is the winner of this test. At the pure C/AL level, NAV 2013 performs almost as fast there as Classic on native, but the RTC under NAV 2013 is still the slowest. My results are very different. I can’t be 100% sure why, but I’ll offer a couple of thoughts at the end of this post.
In any case, these are the measurements I got:
A picture is worth a thousand words, so here it comes:
But NAV 2013 RTC also showed respectable performance: it performed 21% better than the NAV 2009 Classic client on the native database. I didn’t quite expect this, because the Classic client on the native database is a native ISAM system, and NAV business logic is entirely optimized to fly on it. What astonishes me is the 128% improvement of NAV 2013 over NAV 2009 in Web services, or the 137% improvement in RoleTailored client performance. That’s truly amazing.
Obviously, NAV 2013 provides considerable improvement over NAV 2009.
2. Repeated Read
This test measures the capability of a client to iterate through a series of records. Iteration is something C/AL code does frequently, and something at which every flavor of NAV somewhat sucked under SQL Server compared with the sheer performance of the native database. Again, the native database and C/AL as a language are optimized precisely for this kind of access, and it was never a wonder that native was king here.
However, NAV 2013 seems to have just deposed that king:
Graphically, this is how it looks:
I don’t want to spend any time comparing the speed of NAV 2013 with the speed of NAV 2009 native; what I want to do is point out the speed improvement by a factor of more than 400x over NAV 2009 on SQL. How cool is that?
3. Repeated Read of Filtered Tables
The beauty of this test is that it shows how well a system copes with a complex filter. I set the filter on the Name and Description columns of the Customer, Vendor and Item tables respectively to this: @*a* (it searches for the letter a anywhere in the field, case-insensitively).
This filter can’t make meaningful use of any key, so what wins or loses this race is the capability of the database management system to handle such a scan on foot.
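Applied in C/AL, the filter from this test would look something like the following sketch (the empty loop body is assumed to match the repeated-read test):

```
// '@' makes the filter case-insensitive; '*' matches any character sequence
Cust.SETFILTER(Name, '@*a*');
IF Cust.FINDSET THEN
  REPEAT
  UNTIL Cust.NEXT = 0;
```

Because no index can serve a contains-style pattern, this forces a scan regardless of the keys defined on the table.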
Again, NAV 2013 played this one coolly.
Here go the results:
This is the graph:
4. Reading Unique G/L Account Numbers from G/L Entry
Now, this was a tricky one. It uses several concepts whose combination is a total no-brainer for the ISAM-based NAV 2009 on native, but verges on rocket science for anything SQL-related. It was literally the most inefficient thing you could do to a SQL database in NAV, and running a piece of code such as this smothers SQL Server by causing it to drop existing cursors and create new ones all the time.
The algorithm is as follows:
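The algorithm listing didn’t survive in this copy, but the classic C/AL idiom for reading unique values – which provokes exactly the cursor churn described above on older versions – goes roughly like this (a sketch, not necessarily the author’s exact code):

```
// Read each distinct G/L Account No. from the G/L Entry table
GLEntry.SETCURRENTKEY("G/L Account No.");
IF GLEntry.FINDSET THEN
  REPEAT
    // GLEntry."G/L Account No." holds a value not seen before; record it here

    // Jump past the remaining entries for this account:
    GLEntry.SETRANGE("G/L Account No.", GLEntry."G/L Account No.");
    GLEntry.FINDLAST;
    GLEntry.SETRANGE("G/L Account No.");  // clear the filter again
  UNTIL GLEntry.NEXT = 0;
```

Every SETRANGE/FINDLAST pair changes the filter mid-iteration, which is trivial for an ISAM engine but forces a SQL backend to abandon and rebuild its server-side cursors on every distinct value.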
And this is exactly what the results show:
And then, the picture:
While I have a very plausible explanation for what made NAV 2013 win all the previous tests, I don’t have the faintest idea what kind of magic made it perform this well here. Yes, it is slower than native, but this was like making a Formula One car compete in a rally.
Catch this: native is fully optimized for this kind of access, and doing this is no harder for it than the simplest kind of data iteration. As a matter of fact, since there were fewer rows to read, this one should have been faster than the repeated read test. And it was. At the same time, NAV 2009 on SQL was slower here, because this put much more pressure on it, and it had to struggle. And struggle it did.
Yet, NAV 2013, while still struggling, has shown an incredible performance improvement to make even this kind of thing perform well. Quite a job, Microsoft!
5. SIFT Read
I measured how various systems perform with SIFTs, in a scenario quite common in real life: iterating through a set of data and calculating flow fields for each row. NAV does this in many situations, and I was very curious to find out how fast NAV 2013 would be here, because of the many changes Microsoft has done in handling the flow fields in NAV 2013.
Here are the results:
Or graphically:
Okay, I assume some serious caching took place here as well, but caching or not, the whole system performs better and faster in NAV 2013. Compared to the SQL Server flavors of NAV 2009, the 532% improvement is quite amazing, even more so when you consider that probably everybody thought Microsoft had hit the limit when it replaced SIFT tables with indexed views in 5.0 SP1. With that obviously not having been a limit at all, I now wonder whether we shall see even more improvement here in the future.
5a. SETAUTOCALCFIELDS
Finally, I ran the same test as the previous one, with SETAUTOCALCFIELDS. I expected a serious improvement, but at an average of 1,466 milliseconds this test performed only insignificantly faster than the previous one. I expected this one to show the real improvement over the traditional CALCFIELDS approach, but it stubbornly declined. I can’t explain this, but hey, let’s not get too picky.
Overall Results
When you add all of the figures above together, the cumulative results demonstrate that Microsoft Dynamics NAV 2013 outperforms its previous incarnations, including the so-far unbeatable Classic Client on a native database.
On average, this is what it took the three clients to execute all tests:
And the last picture of the day:
However, this is only part of the story. There is another one: concurrency. Performance is always welcome, but performance is not what has been preventing NAV from scaling as much as, for example, AX can. I wonder whether Microsoft will release a hardware sizing document estimating some kind of upper limit for the vertical scalability of NAV 2013. The last time we got such numbers from Microsoft was with version 5.0 SP1, when the limit was put at 250 concurrent users.
Of course, any estimate of this kind is comparing apples to oranges anyway, because at that number of users the application is probably always heavily customized, and the actual upper vertical scalability limit will invariably depend on a very complex set of parameters and can be determined only on a case-by-case basis.
I would’ve loved to have done concurrency tests together with the performance tests, but I may do that another time. However, based on the figures I see here, I dare estimating that everything else being equal, concurrency levels can at least be doubled in any given NAV 2013 deployment, over an equal NAV 2009 deployment.
But What About The Other Test?
So, why do Per Mogensen’s test show somewhat different results? On the C/AL level, hist test is very consistent with my measurements with Web services, but in Per’s tests, NAV 2013 performance with RTC is still inferior to all other clients and platforms.
I can’t tell for sure, but I’ll give my best guess:
So, who do you trust, Per or me? Neither one! Please, don’t just take my findings for granted. Do the measurements yourself.
Here, I’ve attached the objects that I’ve used to run the benchmark, so you can run the same tests on your own machine, and see your own results. I am really curious about the results you’ll get.
So, download the objects:
The reason why there are three distinct sets of objects is that NAV 2013 uses .NET Interoperability in addition to system time to measure time, and that native doesn’t use role centers. Everything else is exactly the same.
(Just in case you need it, here is also my Excel sheet with testing results and charts.)
Run the tests, and then come back here and share your findings. I’d love to hear from you!
Apparently, outstandingly well.
Over the past two days, I have intensively tested NAV 2009 and NAV 2013 through a series of five different tests that measure different aspects of NAV data handling. My conclusion is clear: NAV 2013 is faster than any NAV you have ever seen, including the Classic client on the native database.
Continue reading to find out more about my findings and testing approach.
Is This Some Kind Of A Trick?
No, this is not a trick. It’s for real.
Several days ago I wrote about performance improvements in Microsoft Dynamics NAV 2013, and then got a comment that it all looks nice in theory, but that NAV 2013 is actually slower than NAV 2009. Per Mogensen of Mergetool.com has done some testing and published a video demonstrating the results.
I’ve reviewed the video, and I noticed a couple of possible issues with how the performance was measured, so I decided to check for myself. My results show something completely different: not only is NAV 2013 faster than NAV 2009, it’s also faster than the Classic client on the native database – kind of a holy grail of NAV performance.
And then I double-checked with Per, and he confirmed to me that he also noticed a couple of problems himself. He has repeated the tests, and his tests now also show great improvement in NAV 2013. His updated video is here.
But, let’s continue with my results.
The Racing Horses
To find out how fast NAV 2013 really is, I’ve compared it to other flavors of NAV. The racing horses were:
- Microsoft Dynamics NAV 2013 RoleTailored Client
- Microsoft Dynamics NAV 2013 Web Services
- Microsoft Dynamics NAV 2009 RoleTailored Client
- Microsoft Dynamics NAV 2009 Web Services
- Microsoft Dynamics NAV 2009 Classic Client – SQL Server Option
- Microsoft Dynamics NAV 2009 Classic Client – Native Database Option
The Environment
All of the applications have had exactly the same operating conditions, under exactly the same environment and system configuration settings. The following are the system specifications:
- Intel Core i7-2620M CPU (Quad Core)
- 8 GB of RAM
- OCZ Vertex2 SSD drive
- Windows 7 Ultimate, Service Pack 1, 64-bit
- Microsoft SQL Server 2008 R2, Standard Edition, 64-bit
The Tests
All six applications had to endure the same testing conditions, and ran the following tests:
- Creating, releasing, shipping and invoicing a sales order, 100 times in a row (Per Mogensen’s original test)
- Iterating through all customers, vendors, and items, 500 times in a row
- Iterating through a filtered list of customers, vendors, and items, with an inefficient filter over a text field, 500 times in a row
- Iterating through a unique list of G/L accounts from the G/L Entry table, 500 times in a row.
- Manually summing flow fields of all customers, vendors, and items, by calling CALCFIELDS on each row, 500 times in a row.
- Manually summing the balance and inventory flow fields of all customers, vendors, and items, respectively, by calling SETAUTOCALCFIELDS before the iteration, 500 times in a row.
Before each of the tests, I prepared the environment by doing the following:
- I stopped all instances of NAV and closed all clients and made sure no applications were running.
- I created a new empty database.
- I restored the W1 database into the newly created database.
- I started the relevant service tier and clients, and then ran all the tests three times to warm the system up.
- I cleared the time logs to eliminate the warm-up results, and make sure they don’t distort the test results.
- Closed any unnecessary applications (e.g. the Classic Client before using the RTC to run the tests) to ensure that only the environment which is running the test is open.
- Ran the test three times in a row.
- Copied the results from the log table into Excel.
Each of the tests records the time right before the test starts, and then again right after it ends. The time difference is then logged into the database.
I measured the time by creating two DateTime instances, setting them to current system time, then subtracting the start time from the end time. This gives the duration in milliseconds. In addition to this, under NAV 2013 I’ve added another measurement method: the .NET System.Diagnostics.Stopwatch class, just in case – if there is anything flawed with NAV’s time variable in 2013, certainly nothing will be wrong with the .NET Stopwatch. As expected, there was no difference between what NAV calculated and what the System.Diagnostics.Stopwatch measured.
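For reference, here is a minimal C/AL sketch of the measurement pattern described above; the variable names are mine, and the attached test objects may structure this differently:

```
// Both versions: plain DateTime arithmetic
StartTime := CURRENTDATETIME;
RunTest;  // the test body goes here
EndTime := CURRENTDATETIME;
DurationMs := EndTime - StartTime;  // a DateTime difference yields a Duration in milliseconds

// NAV 2013 only: cross-check via .NET Interoperability
// Stopwatch is a DotNet variable of type System.Diagnostics.Stopwatch
Stopwatch := Stopwatch.StartNew;
RunTest;
Stopwatch.Stop;
ElapsedMs := Stopwatch.ElapsedMilliseconds;
```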
In the results, all measurements I present are in milliseconds, and in all test results I’ll show, less is better.
The Results
Finally, we get to the point which I believe you awaited as much as I did: the results. Let me present them test by test.
1. Sales Orders
In Per Mogensen’s tests, the NAV 2009 Classic Client on a native database is the winner of this test. At pure C/AL level, NAV 2013 performs almost as fast as Classic on native, but the RTC under NAV 2013 is still the slowest. My results are very different. I can’t be 100% sure why, but I’ll give a couple of thoughts at the end of this post.
In any case, these are the measurements I got:
2013, Web Services: 5,169
2013, RoleTailored Client: 6,186
2009, Classic Client, Native: 7,467
2009, Web Services: 11,778
2009, Classic Client, SQL: 14,420
2009, RoleTailored Client: 14,690
A picture is worth a thousand words, so here it comes:
Image 1: Sales Orders test results
As I expected, the Web Services perform faster on both 2009 and 2013, because there is no user interface and only the NST is involved in code execution. Under Web Services, NAV 2013 performs about 44% faster than the fastest breed of NAV ever – the Classic Client on a native database. Stripped of the burden of a UI, NAV 2013 Web Services practically demonstrate pure SQL Server performance, and SQL Server is faster than ever before, just as it says on the tin.
But NAV 2013 RTC also showed respectable performance. It performed 21% better than the NAV 2009 Classic Client on a native database. I kind of didn’t expect this, because the Classic Client on the native database is a native ISAM system and NAV business logic is entirely optimized to fly on it. What astonishes me is the 128% improvement of NAV 2013 over NAV 2009 in Web Services, or the 137% improvement in RoleTailored Client performance. That’s truly amazing.
Obviously, NAV 2013 provides considerable improvement over NAV 2009.
2. Repeated Read
This test measures the capability of a client to iterate through a series of records. Iteration is something that C/AL code frequently does, and where any flavor of NAV somewhat sucked under SQL Server, as compared with the sheer performance of the native database. Again, native database and C/AL as a language are optimized precisely for this kind of access, and it was never a wonder that the native was a king here.
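The iteration in question is the everyday C/AL loop; a minimal sketch (the actual test repeats this 500 times over the Customer, Vendor and Item tables):

```
IF Customer.FINDSET THEN
  REPEAT
    // touch the record and nothing else, so the loop measures pure read speed
  UNTIL Customer.NEXT = 0;
```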
However, NAV 2013 seems to have just deposed that king:
2013, Web Services: 16
2013, RoleTailored Client: 25
2009, Classic Client, Native: 644
2009, Web Services: 8,081
2009, RoleTailored Client: 8,133
2009, Classic Client, SQL: 8,637
Graphically, this is how it looks:
Image 2: Repeated Read of Customers, Vendors and Items
NAV 2013 is lightning fast here, and no wonder why: the caching. While NAV 2009 on SQL Server had to maintain a series of cursors, NAV 2013 ran a single T-SQL query, and then cached the records for subsequent reads. It simply outperforms everything.
I don’t want to spend any time comparing the speed of NAV 2013 with the speed of NAV 2009 native; what I want to do is point out the speed improvement by a factor of more than 400x over NAV 2009 on SQL. How cool is that?
3. Repeated Read of Filtered Tables
The beauty of this test is that it shows how well a system copes with a complex filter. I’ve set the filter on Name and Description columns on Customer, Vendor and Item table respectively to this: @*a* (it searches for letter a anywhere in the field, in a case-insensitive way).
This filter can’t make meaningful use of any key, so what will win or lose this race is the capability of the database management system to handle such a scan on foot.
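In C/AL terms, the filtered loop looks like this (shown for the Customer table; the Vendor and Item tables work the same way with their Name and Description fields):

```
// '@' makes the filter case-insensitive; '*a*' matches the letter anywhere in the field,
// so no index can help and the server must scan
Customer.SETFILTER(Name,'@*a*');
IF Customer.FINDSET THEN
  REPEAT
    // iterate the filtered set
  UNTIL Customer.NEXT = 0;
```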
Again, NAV 2013 played this one coolly.
Here go the results:
2013, Web Services: 20
2013, RoleTailored Client: 24
2009, Classic Client, Native: 515
2009, Web Services: 5,720
2009, RoleTailored Client: 5,741
2009, Classic Client, SQL: 6,178
This is the graph:
Image 3: Repeated Read of Filtered Tables
While the variances among the NAV 2009 SQL Server flavors are insignificant, the improvement of NAV 2013 again verges on insane. It’s obvious that the cache kicked in big time here, but I also assume that there may be some .NET-level code optimization that made this kind of thing possible.
4. Reading Unique G/L Account Numbers from G/L Entry
Now, this was a tricky one. It uses several concepts, combination of which is a total no-brainer for the ISAM-based NAV 2009 on native, but verges on rocket science for anything SQL-related. It was literally the most inefficient thing to do to a SQL database in NAV, and running a piece of code such as this literally smothers SQL by causing it to drop existing and create new cursors all the time.
The algorithm is as follows:
- Set the key on G/L Account column
- Find the first G/L Entry row
- Set a filter on the G/L Account column to that G/L Account which is currently selected
- Find the last G/L Entry with this filter applied
- Remove the filter on the G/L Account column
- Repeat steps 3 to 5 until there are no more G/L Entry rows
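The steps above correspond to the classic C/AL pattern for reading unique key values; here is a sketch (my own rendering, so the attached objects may differ in detail):

```
GLEntry.SETCURRENTKEY("G/L Account No.");
IF GLEntry.FIND('-') THEN
  REPEAT
    // narrow the view to the current account and jump to its last entry
    GLEntry.SETRANGE("G/L Account No.",GLEntry."G/L Account No.");
    GLEntry.FIND('+');
    // GLEntry."G/L Account No." now holds one unique account

    // remove the filter, so NEXT moves to the first entry of the next account
    GLEntry.SETRANGE("G/L Account No.");
  UNTIL GLEntry.NEXT = 0;
```

The constant filter changes mid-loop are exactly what forces a cursor-based engine to keep dropping and re-creating cursors.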
And this is exactly what the results show:
2009, Classic Client, Native: 478
2013, Web Services: 1,129
2013, RoleTailored Client: 1,142
2009, Web Services: 11,029
2009, RoleTailored Client: 11,113
2009, Classic Client, SQL: 11,555
And then, the picture:
Image 4: Repeated Read of SIFT-Filtered Tables
Now, before jumping out of your seat and shouting “gotcha!”, think of this test once again. NAV 2013 is almost 10x faster than NAV 2009 here, and whatever it did deep there in its engine is close to a miracle. If it did caching to attain this speed, that caching must be pretty smart, because this piece of code was accessing some very small sets and jumping around the records like crazy.
While I have a very plausible explanation for what made NAV 2013 win all the previous tests, I don’t have the faintest idea what kind of magic made it perform this well here. Yes, it is slower than native, but this was kind of like making a Formula One car compete in a rally.
Catch this: native is fully optimized for this kind of access, and doing this is no harder for it than the simplest kind of data iteration. As a matter of fact, since there were fewer rows to read, this one should have been faster than the repeated read test. And it was. At the same time, NAV 2009 on SQL was slower here, because this put much more pressure on it, and it had to struggle. And struggle it did.
Yet, NAV 2013, while still struggling, has shown an incredible performance improvement to make even this kind of thing perform well. Quite a job, Microsoft!
5. SIFT Read
I measured how various systems perform with SIFTs, in a scenario quite common in real life: iterating through a set of data and calculating flow fields for each row. NAV does this in many situations, and I was very curious to find out how fast NAV 2013 would be here, because of the many changes Microsoft has done in handling the flow fields in NAV 2013.
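The two variants I tested look roughly like this, sketched on the Customer table with its Balance (LCY) flow field (Vendor and Item work the same way with their respective flow fields):

```
// Test 5: call CALCFIELDS on each row
IF Customer.FINDSET THEN
  REPEAT
    Customer.CALCFIELDS("Balance (LCY)");
    TotalBalance += Customer."Balance (LCY)";
  UNTIL Customer.NEXT = 0;

// Test 5a (NAV 2013 only): declare the flow fields once, before the loop
Customer.SETAUTOCALCFIELDS("Balance (LCY)");
IF Customer.FINDSET THEN
  REPEAT
    TotalBalance += Customer."Balance (LCY)";  // calculated automatically on read
  UNTIL Customer.NEXT = 0;
```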
Here are the results:
2013, RoleTailored Client: 1,517
2013, Web Services: 1,518
2009, Classic Client, Native: 1,638
2009, Web Services: 9,500
2009, RoleTailored Client: 9,552
2009, Classic Client, SQL: 9,745
Or graphically:
Image 5: SIFT Read
When handling flow fields, NAV 2013 performs slightly better than native ever did, about 8% faster. This is quite a feat, if you keep in mind that native handles this functionality, well, natively, by building the flow field information right into its indexes, something SQL never could.
Okay, I assume that some serious caching took place here as well, but still, caching or not, the whole system performs better and faster in NAV 2013. Compared to the SQL Server flavors of NAV 2009, the improvement of 532% is quite amazing, and even more so when you consider that probably everybody thought Microsoft had hit the limit by replacing SIFT tables with indexed views in 5.0 SP1. With that obviously not having been a limit at all, I now wonder whether we’ll see even more improvement here in the future.
5a. SETAUTOCALCFIELDS
Finally, I ran the same test as the previous one, but with SETAUTOCALCFIELDS. I expected serious improvement, but at an average of 1,466 milliseconds, this test performed only insignificantly faster than the previous one. I expected this one to show the real improvement over the traditional CALCFIELDS approach, but it stubbornly declined to. I can’t explain this, but hey, let’s not get too picky.
Overall Results
When you add all of the figures above together, the cumulative results demonstrate that Microsoft Dynamics NAV 2013 outperforms its previous incarnations, including the so-far unbeatable Classic Client on a native database.
On average, this is what it took the three clients to execute all tests:
2009 SQL: 48,636
2009 Native: 10,743
2013: 8,374
And the last picture of the day:
Image 6: Overall Results
Obviously, the improvements that NAV 2013 promises are not just plain words, as these test results show. The overall performance is about 28% better than with the NAV 2009 Classic Client on a native database, and about 480% better (that’s almost a 6x performance improvement!) than with NAV 2009 under SQL Server.
However, this is only a part of the story. There is another one: concurrency. Performance is always welcome, but performance is not what has been preventing NAV from scaling as much as, for example, AX could. I wonder if Microsoft will release a hardware sizing document that estimates some kind of upper limit for the vertical scalability of NAV 2013. The last time we got such numbers from Microsoft was with version 5.0 SP1, when it was set at 250 concurrent users.
Of course, any estimates of the kind are comparing apples to oranges, anyway, because at that number of users, the application is probably always heavily customized, and the actual upper vertical scalability limit will invariably depend on a very complex set of parameters, and can be determined only on a case-by-case basis.
I would’ve loved to have done concurrency tests together with the performance tests, but I may do that another time. However, based on the figures I see here, I dare estimate that, everything else being equal, concurrency levels can at least be doubled in any given NAV 2013 deployment over an equivalent NAV 2009 deployment.
But What About The Other Test?
So, why does Per Mogensen’s test show somewhat different results? At the C/AL level, his test is very consistent with my measurements with Web services, but in Per’s tests, NAV 2013 performance with the RTC is still inferior to all the other clients and platforms.
I can’t tell for sure, but I’ll give my best guess:
- Virtualization: The systems were comparable, but the tests were run under different virtual machines, and the virtual hypervisor in charge might have redistributed resources, or other virtual machines were doing some cleanups while NAV 2013 was running, or a whole range of other things might have happened.
- Hardware: The RTC is a .NET application, and depends a lot on hardware on a machine to execute all the things that .NET applications do: just-in-time compilation during warm-up, and talking to video drivers at run time. Since it was a virtual box, maybe the virtual hardware causes troubles with .NET applications talking to it, while it performs better when Win32 applications (as the Classic client) are talking to it.
- Warm-up: While it certainly should be enough to run 100 sales orders through the create-release-ship-invoice cycle to warm a system up, I still think that a thorough warm-up is required for any kind of benchmarking. The warm-up time itself should be disregarded, as it is no measure of either real performance under pressure or scalability. To determine whether the system is properly warmed up, you need to keep running warm-ups until you see no significant performance difference between two consecutive runs. Only then can you start measuring. The minimum number of runs needed to determine this condition is three.
So, who do you trust, Per or me? Neither one! Please, don’t just take my findings for granted. Do the measurements yourself.
Here, I’ve attached the objects that I’ve used to run the benchmark, so you can run the same tests on your own machine, and see your own results. I am really curious about the results you’ll get.
So, download the objects:
The reason why there are three distinct sets of objects is that NAV 2013 uses .NET Interoperability in addition to system time to measure time, and that native doesn’t use role centers. Everything else is exactly the same.
(Just in case you need it, here is also my Excel sheet with testing results and charts.)
Run the tests, and then come back here and share your findings. I’d love to hear from you!
NAV 2013 Moves to Simplified Pricing
Simplified Pricing of Microsoft Dynamics GP 2013 and NAV 2013 for SMBs Announced at WPC 2012
Simplicity was the word of the day at WPC 2012 for Microsoft Dynamics GP and NAV partners learning about the new perpetual licensing scheme that will take effect for on-premise deployments and upgrades to NAV 2013 and GP 2013. The changes will result in far fewer individual line items in the price list, with the aim of making it easier for partners to generate quotes and customers to understand them.
The goal of the changes is to drive volume, optimize licensing for SMBs, and simplify the approach for perpetual license sales (subscription pricing will be a different issue altogether, Microsoft says). The changes will reduce prices for many customers, especially those with fewer than 100 users, though larger implementations and some others could see license costs increase, which, Microsoft told the partner audience, is by design.
"We wanted it to be so simple you could figure out pricing on the back of a napkin," explained Carl Radecke, worldwide pricing lead for Microsoft to a full house of several hundred Dynamics ERP professionals. To drive the point home, the audience received cocktail napkins imprinted with the new prices.
12 July 2012, Thursday
The Top New Features of Dynamics NAV 2013
Microsoft have been getting a great amount of media coverage recently, with announcements about Windows 8, Windows Phone 8, Surface and Office 2013. In addition, outside of the mainstream media and within business circles, they have announced new versions of Windows Server, SQL Server, and Dynamics NAV (formerly Navision).
It’s Dynamics NAV 2013 that I want to focus on in this post. This is the software release that has got me the most excited, most probably because my day to day work involves me delivering solutions based on Dynamics NAV. In fact if you speak to most people in the Dynamics world, they are all getting excited about Dynamics NAV 2013. The main reason is due to the shift in the technologies that work hand in hand with NAV 2013. Some of these technologies were there in NAV 2009 in one shape or another, but that was just a stepping stone towards NAV 2013.
For this post I’m going to highlight what I feel are the top 4 new features (some may be classed as improvements on existing features) in Dynamics NAV 2013.
SharePoint and Web Client
I’ll call this one out as one new feature, because they do the same thing, but really it’s two.
Firstly, the SharePoint client allows you to add a NAV page as a SharePoint part. It’s not widely known, but you could do this in NAV 2009. It just didn’t work very well and required you to jump through several hoops to get it to run. Now though, it just works. No extra installs or complicated configuration needed. Just add a new part to the required page in SharePoint and it all just works great. What’s even slicker is that when you click on something in the part, a Microsoft Dynamics NAV ribbon appears at the top of the page in the SharePoint ribbon.
Additionally, you can create a SharePoint page that contains various different parts. Each part might be a NAV page, or alternatively you could mix and match NAV parts with Dynamics CRM parts and then use SharePoint connectors to connect the different parts so that clicking on one part, updates the other. For example, a customer list part from CRM might update a sales invoice list part from NAV, with the sales invoice list being filtered by whichever customer is selected in the CRM list part.
You can create a true dashboard/cockpit view on a page within SharePoint. Other uses might be more simplistic, for such “light user” activities like posting a timesheet (timesheets are another new feature of NAV 2013) or looking up and updating supplier details.
Moving onto the new NAV web client, the same can be said for this as for the SharePoint client. It was all possible with NAV 2009, but required a third-party product in order to do it, or required extensive development of custom .NET web pages that hooked into web service enabled pages. With the Web Client though, it all comes out of the box, and just works. I’ve been playing with the beta a fair bit and it really is great. I won’t speculate on what MSFT’s plans are for the 2013 pricing model, but I really expect this to change the landscape of how NAV is sold. It’s pretty old news now that NAV will be hostable on Microsoft’s Azure platform, and don’t be too surprised if we see a similar model to CRM Online that allows you to purchase NAV Online direct from Microsoft.
Almost everything is there in the Web Client that is in the Windows Client (you can’t call it Classic client and Role Tailored Client any more since the Classic client doesn’t exist). I think a couple of the misconceptions about the web client were that matrix type screens wouldn’t be able to run (they look the same as in the windows client) and reports wouldn’t run (which they do). So what’s missing? So far, the things I’ve noticed are:
- No search box (something that I’ve seen most RTC users depending on)
- No Departments menu (this really backs up the fact that the Web Client isn’t for “power users”), you literally have access to what your role dictates.
- No charts in the role centre.
- No MS Office integration, e.g. no Outlook part or Send to Excel.
- No customisation/personalisation (This was probably one of the biggest selling points of the RTC)
Office Ribbon
The introduction of the Office 2010 ribbon in NAV 2013 is a hugely welcome addition. When the first iteration of the ribbon was introduced back in Office 2007, there was much confusion and grumbling from the Office-using community. Myself included. People generally don’t like change. We all know it. By the time Office 2010 came along, I really felt that Microsoft had taken some of the community’s criticisms on board, as it was a real improvement. I occasionally have to revert back to older versions of Office when on customer sites that haven’t made the upgrade from Office 2003 yet, and it is really painful using those products. If I’m writing a report in Excel, for example, it takes me almost twice as long to do certain things in the older (non-ribbon) version compared to Office 2010. Anyway, I digress… The ribbon makes so much more sense when navigating around the current page in NAV 2013. Everything is a lot easier to find and it’s all still customisable for the user.
But that’s not all. One of the biggest frustrations for users going from the older Classic client to the RTC was the inability to filter on sub-form lines, for example lines on a purchase order. And then finding the functions from the little lightning bolt icon was a pain too, especially when it vanished and you were trying to explain to a customer over the phone to click on the lightning bolt that they couldn’t see! The vanishing lightning bolt is kind of still there, in the form of a vanishing settings cog icon, but they have laid out the line actions in a much easier way and added the ability to search and filter on the lines. So much better.
Table Locking Architecture and Improved Dimension Handling Redesign
Big changes have been made to speed up the general ledger posting process in NAV 2013, which will enable more users to post orders at the same time. This has always been a stumbling block for NAV and has historically limited the size of organisations that NAV could be implemented into. This should now be a thing of the past. In worst-case scenarios NAV 2013 can be set to delay the posting of large numbers of transactions to out of hours, so as not to hinder day-to-day use of the system.
The way dimensions are held against transactions has completely changed too. Previously, viewing and filtering on any dimension that isn’t one of the two global dimensions was extremely difficult without a modification or an additional report (with the help of analysis view entries). This has all changed in NAV 2013. As put by the NAV team:
In Microsoft Dynamics NAV 2013, the dimensions functionality has been heavily redesigned. Instead of storing all individual dimension values for each record in separate tables, each unique combination of dimensions and values gets an ID, and this dimension set ID is stored directly on the record that those values belong to. With this change, we have taken an important step: to store all information about dimensions and their values directly on the record.

This will make dimensional reporting much easier.
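The dimension-set-ID idea can be illustrated with a small sketch. This is not NAV's actual table design, just a minimal model of the concept the NAV team describes: every unique combination of dimension values is assigned one ID, and records carry only that ID.

```python
# Illustrative sketch of the dimension set ID concept, not NAV's
# actual implementation: each unique combination of dimension values
# gets one ID, and records store just that ID.

class DimensionSetStore:
    def __init__(self):
        self._ids = {}      # frozen dimension combination -> set ID
        self._sets = {}     # set ID -> dimension combination
        self._next_id = 1

    def get_set_id(self, dimensions):
        """Return the ID for this combination, creating it if new."""
        key = frozenset(dimensions.items())
        if key not in self._ids:
            self._ids[key] = self._next_id
            self._sets[self._next_id] = dict(dimensions)
            self._next_id += 1
        return self._ids[key]

    def get_dimensions(self, set_id):
        """Look up the full combination from a stored set ID."""
        return self._sets[set_id]

store = DimensionSetStore()
id_a = store.get_set_id({"DEPARTMENT": "SALES", "PROJECT": "VW"})
id_b = store.get_set_id({"DEPARTMENT": "SALES", "PROJECT": "VW"})
id_c = store.get_set_id({"DEPARTMENT": "ADM"})
print(id_a == id_b)  # True: identical combinations share one ID
```

Because filtering and reporting only ever need to compare one integer per record, this is what makes reporting on non-global dimensions so much cheaper than the old one-row-per-dimension-value design.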
On top of this, independent tests by people in the NAV community have noted that general speed of use in NAV 2013 is much faster than the NAV 2009 RTC, which is always welcome to hear.
Charts and Visualisation
Again, this was something that was hinted at in NAV 2009 with the chart parts. They looked flash, but they just weren't all that useful. With the new charts, though, you can hover your mouse over the parts and they will show you additional information, such as actual values. In some cases (depending on how the chart has been designed) you can also click through on the charts and they will show you the underlying data. You can also add parts that are essentially built from Account Schedules, which allow you not only to click through, but also to alter the periods you are looking at, change the period lengths, and force a refresh (meaning you don't have to come out of the role centre and back in). In fact, NAV 2013 has gone chart crazy: you can convert pretty much any page in the system into a chart.
So, that was more than four really if you were counting, but these are just some of the things I’m looking forward to in NAV 2013.
http://dotte.ch/the-top-new-features-of-dynamics-nav-2013/?goback=%2Egde_4259732_member_133010052
Labels:
Microsoft Dynamics Navision,
NAV,
NAV 2013,
NAV 7
Wednesday, 11 July 2012
Simulation of Average Cost Calculation on Dynamics NAV
When using the Average costing method, it is sometimes difficult to interpret the costs assigned to an outbound entry.
The scenario below describes a process for simulating the average cost calculation. It is often used when investigating costing issues related to the average costing method. It has been helpful for verifying the recognized COGS, for explaining the average cost calculation, and for identifying the point in time at which the average cost became unexpected. It helps to narrow down the area for deeper research into the records showing unexpected values.
This blog post describes only how the average cost calculation can be simulated; it does not cover possible causes or possible correction processes.
The following scenario is carried out to create the basic data set. The data set is then used as the base for processing and analyzing the average cost calculation with the Average Cost Period set to Day and to Month, respectively.
The scenario is carried out in a W1 Cronus database.
The first two steps create the basic data set, which is later used in the respective simulations of the average cost calculation.
1. Create Item: TEST
Costing Method: Average
Unit Cost: 10
2. Create and post the following documents.

If you are aiming to create the full scenario and work through the respective simulations of the average cost calculation for Day and Month, this is a good point to create a backup.
Inventory setup, Average Cost Calc Period = Day
3. Run the Adjust Cost - Item Entries batch job.
4. Filter the Item Ledger Entry table on Item TEST and review the fields specified below.
5. Open Revaluation Journal
6. Run Function Calculate Inventory Value:
Filter Item: TEST
Posting Date: September 15, 2011
Per Item
7. Change Unit Cost (Revalued) to 12 as above.
8. Post Line.
9. Run Adjust Cost - Item Entries batch job.
10. Filtering the Value Entries table on Item TEST, the following records are available:
Simulation of Average Cost Calculation with Average Cost period = Day
Now we are moving into the process of simulating the average cost calculation when the Average Cost Period is Day, using the data for item TEST created in the scenario above.
Once you have identified an item that needs further analysis, the following process can be used. Below the described process there is a screenshot showing the results of the simulation of the average cost calculation for item TEST.
In addition, an Excel sheet is attached in which the full data set is available and the formulas used can be reviewed more closely.
1. In the Value Entries table, filter on the particular Item that is to be analyzed.
If the Average Cost Calc. Type is per Item & Location & Variant, the filter must also cover those fields with the particular values in scope for the analysis.
2. Paste the filtered value entries into Excel.
3. Do a Custom Sorting using the fields as below:
Comments on the fields that are part of the sorting:
Valuation Date: the date on which the entry becomes part of the average cost calculation.
Partial Revaluation: a field that is Yes on value entries of type Revaluation. Revaluations affect the valuation of the following period's outbound entries, not the outbound entries of the same period.
Valued Quantity: populated on every value entry; corresponds to the item ledger entry quantity, invoiced quantity, or revalued quantity. Sorting largest to smallest brings the inbound entries before the outbound entries of the period, which creates the base for calculating the period's average cost.
Item Ledger Entry No.: groups the value entries attached to the same item ledger entry.
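The custom sort in step 3 can be sketched as follows, using illustrative dicts in place of real value entries; the key order follows the comment list above. The entry numbers and quantities are made up, not the Cronus data.

```python
# Sketch of the step 3 custom sort: Valuation Date, Partial Revaluation,
# Valued Quantity (largest to smallest), Item Ledger Entry No.
# The entries below are illustrative, not real NAV records.
from datetime import date

entries = [
    {"Valuation Date": date(2011, 9, 15), "Partial Revaluation": False,
     "Valued Quantity": -3, "Item Ledger Entry No.": 320},
    {"Valuation Date": date(2011, 9, 15), "Partial Revaluation": False,
     "Valued Quantity": 10, "Item Ledger Entry No.": 318},
    {"Valuation Date": date(2011, 9, 16), "Partial Revaluation": False,
     "Valued Quantity": -2, "Item Ledger Entry No.": 321},
]

entries.sort(key=lambda e: (
    e["Valuation Date"],
    e["Partial Revaluation"],   # False sorts before True
    -e["Valued Quantity"],      # largest first: inbound before outbound
    e["Item Ledger Entry No."],
))

# Within 2011-09-15 the inbound entry (quantity 10) now comes before
# the outbound entry (quantity -3) of the same period.
print([e["Item Ledger Entry No."] for e in entries])  # [318, 320, 321]
```

Placing inbound entries first within each period is what allows the summary line in the next step to establish the period's average before any consumption is valued against it.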
4. Insert summary lines where you want to establish the period's average cost (grey lines below). A summary line should be inserted above the first outbound entry of a period. To identify the breakpoint for inserting the summary line, follow these steps:
a. Establish the Valuation Date to be in scope for the investigation and locate these entries in the sorted Value entry list.
b. Then follow the stated quantities in the Valued Quantity field for the chosen valuation date. Identify the first line with a negative quantity: that is the first outbound entry of the period.
c. Insert a line for the calculation above the first outbound entry of the period (example in the screenshot below and in the attached spreadsheet: column M, rows 3 and 5; row 3 has a positive Valued Quantity, row 5 a negative one, and the summary line is inserted as row 4).
5. Make a sum of the columns Cost Amount (Actual), Cost Amount (Expected), and Item Ledger Entry Quantity. Calculate the average unit cost of the period (column R) with the following formula:

If you have several summary lines inserted, make sure to include the previous summary line in the calculation of the respective column for the next period.
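The formula itself is only shown in the screenshot and spreadsheet. Assuming it is the standard average-cost formula, the summed cost amounts divided by the summed quantity, the summary-line calculation can be sketched like this; the entries below are illustrative, not the actual Cronus figures.

```python
# Sketch of the step 5 summary-line calculation, assuming the standard
# average-cost formula: (sum of Cost Amount (Actual) + sum of Cost
# Amount (Expected)) / sum of Item Ledger Entry Quantity.

def average_unit_cost(entries):
    """Average unit cost of a period from its inbound value entries
    (include the previous period's summary line here if one exists)."""
    total_cost = sum(e["Cost Amount (Actual)"] + e["Cost Amount (Expected)"]
                     for e in entries)
    total_quantity = sum(e["Item Ledger Entry Quantity"] for e in entries)
    return total_cost / total_quantity

period_entries = [
    # a fully invoiced purchase of 10 pcs at 10.00 each
    {"Cost Amount (Actual)": 100.0, "Cost Amount (Expected)": 0.0,
     "Item Ledger Entry Quantity": 10},
    # a received-but-not-invoiced purchase of 2 pcs expected at 12.00
    {"Cost Amount (Actual)": 0.0, "Cost Amount (Expected)": 24.0,
     "Item Ledger Entry Quantity": 2},
]
print(round(average_unit_cost(period_entries), 5))  # 10.33333
```

Including the expected cost amounts is what lets received-not-invoiced quantities influence the period average, just as the summed columns in the summary line do.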
6. Choose an outbound entry, usually the first outbound entry of the period, and then a couple of others, either randomly selected in the period or of particular interest, and calculate the average cost per unit with the formula above (green, purple, and blue sections in the screenshot below).
- Does it correspond to the average unit cost of the period?
If not, check whether it is fixed applied to an inbound entry: if the field Valued By Average Cost is False, the entry is fixed applied to an inbound entry.
To which entry? Follow up on the parent item ledger entry; the Applies-to Entry field carries the entry number of the supplying item ledger entry.
If it is not fixed applied, establish the Amount Rounding Precision and investigate whether it has an effect on the calculated average cost.
These are the value entries for item TEST after they have been sorted as described in step 3, with summary lines inserted to establish the average cost for each period (steps 4 and 5), and with the first outbound entry of the period calculated (step 6).
In the attached spreadsheet, the formulas used can be checked by clicking in the respective field.
Inventory setup, Average Cost Calc Period = Month
Another Average Cost Calc Period to use is Month. Let's continue with the basic scenario, create some additional data, see the effects of having Month as the Average Cost Calc Period, and finally look into the simulation of the average cost calculation and its specifics.
The scenario continues from the basic data set created through step 2. If you made a backup after step 2 and have been working with the Average Cost Calc Period of Day, you can now restore the backup and start with step 3 below.
3. Change Inventory setup; Average Cost Calc Period to Month.
4. Run Adjust Cost - Item entries batch job.
5. Filter the Item Ledger Entry table, Item TEST, and review the fields specified below.
6. Open Revaluation Journal
7. Run Function Calculate Inventory Value:
Filter Item: TEST
Posting Date: September 30, 2011
Per Item
8. Change Unit Cost (Revalued) to 12 as above.
9. Post Line.
10. Run Adjust Cost - Item Entries batch job.
11. Filtering the Value Entries table, Item TEST, the following records are available:
Simulation of Average Cost Calculation with Average Cost period = Month
Now we are moving into the process of simulating the average cost calculation when the Average Cost Period is Month, using the data for item TEST created in the scenario.
Once you have identified an item that needs further analysis, the following process can be used. Below the described process there is a screenshot showing the result of the simulation of the average cost calculation for item TEST using the value entries created in the scenario. In addition, an Excel sheet is attached in which the full data set is available and the formulas used can be reviewed more closely.
1. In the Value Entries table filter on the particular Item that is to be analyzed.
If the Average Cost Calc. Type is per Item & Location & Variant, the filter must also cover those fields with the particular values in scope for the analysis.
2. Paste filtered Value entries into Excel.
3. Conversion of Valuation Date into Period:
Using an Average Cost Calc Period other than Day requires the valuation date to be translated into the chosen period; in this case it is Month.
The columns mentioned below can be found in the screenshot and in the attached Excel sheet.
a. Column F is added: The Valuation Date column is copied into column F. Thereafter column F is selected and the Format is changed to Number, no decimals. The Valuation Date is now converted to a number in column F.
b. Column G is added and is intended to carry the Year of the Valuation Date:
Select column G and change the format to Number, no decimals.
Add the formula =YEAR(F2) in cell G2, then double-click the plus sign (fill handle) in the bottom-right corner of the cell to generate the year for the rest of the lines.
c. Column H is added and is intended to carry the Period No. of the Valuation Date:
Select column H and change the format to Number, no decimals.
Add the formula =MONTH(F2) in cell H2, then double-click the plus sign (fill handle) in the bottom-right corner of the cell to generate the month for the rest of the lines.
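The same Valuation Date to Year and Period No. translation that columns G and H perform with =YEAR(F2) and =MONTH(F2) can be sketched with Python's standard library:

```python
# Sketch of the column G/H conversion: translate a valuation date into
# the (Year, Period No.) bucket used when the Average Cost Calc Period
# is Month, mirroring Excel's =YEAR(F2) and =MONTH(F2).
from datetime import date

def to_period(valuation_date):
    """Return the (Year, Period No.) bucket for a Month calc period."""
    return (valuation_date.year, valuation_date.month)

print(to_period(date(2011, 9, 15)))  # (2011, 9)
print(to_period(date(2011, 10, 1)))  # (2011, 10)
```

These tuples can also serve directly as the leading sort keys in step 4, since all entries of one month share the same bucket regardless of their exact valuation date.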
4. Do a Custom Sorting using the fields as below:
Comments on the fields that are part of the sorting:
Year and Period No.: the period in which the entry becomes part of the average cost calculation.
Partial Revaluation: a field that is Yes on value entries of type Revaluation. Revaluations affect the valuation of the following period's outbound entries, not the outbound entries of the same period.
Valued Quantity: populated on every value entry; corresponds to the item ledger entry quantity, invoiced quantity, or revalued quantity. Sorting largest to smallest brings the inbound entries before the outbound entries of the period, which creates the base for calculating the period's average cost.
Item Ledger Entry No.: groups the value entries attached to the same item ledger entry.
5. Insert summary lines where you want to establish the period's average cost (grey lines below). A summary line should be inserted above the first outbound entry of a period. To identify the breakpoint for inserting the summary line, follow these steps:
a. Establish the time period to be in scope for the investigation and locate these entries in the sorted Value entry list.
b. Then follow the stated quantities in the Valued Quantity field for the chosen time period. Identify the first line with a negative quantity: that is the first outbound entry of the period.
c. Insert a line for the calculations (column P, rows 4 and 6: row 4 has a positive Valued Quantity, row 6 a negative one, and the summary line is inserted as row 5).
6. Make a sum of the columns Cost Amount (Actual), Cost Amount (Expected), and Item Ledger Entry Quantity. Calculate the average unit cost of the period (column U) with the following formula:

If you have several summary lines inserted, make sure to include the previous summary line in the calculation of the respective column for the next period.
7. Choose an outbound entry, usually the first outbound entry of the period, and then a couple of others, either randomly selected in the period or of particular interest, and calculate the average cost per unit with the formula above (green, purple, and blue sections in the screenshot below).
- Does it correspond to the average unit cost of the period?
If not, check whether it is fixed applied to an inbound entry: if the field Valued By Average Cost is False, the entry is fixed applied to an inbound entry.
To which entry? Follow up on the parent item ledger entry; the Applies-to Entry field carries the entry number of the supplying item ledger entry.
If it is not fixed applied, establish the Amount Rounding Precision and investigate whether it has an effect on the calculated average cost.
These are the value entries for item TEST after they have been sorted as described in step 4, with summary lines inserted to establish the average cost for each period (steps 5 and 6), and with the first outbound entry of the period (plus the second in period 10) calculated (step 7).
Note that all inbound entries in September (Period No. 9) are sorted at the top and affect all outbound entries in September, regardless of the specific valuation date.
To follow the process and review the formulas used, an Excel sheet is attached containing the following tabs:
Basic Data
---------------
Contains the scenario and the data it creates. The basic scenario then moves into two paths: one using Day as the Average Cost Period and the other using Month. The respective sets of value entries are then pasted into the next tabs.
Average Cost simulation - Day
----------------------------------------
At the top, the value entries from the Basic Data scenario for the Average Cost Period of Day are pasted.
The value entries are then processed, sorted, and calculated as described beneath the section of value entries.
Average Cost simulation - Month
-------------------------------------------
At the top, the value entries from the Basic Data scenario for the Average Cost Period of Month are pasted.
The value entries are then processed, sorted, and calculated as described beneath the section of value entries.
Any feedback on how this process and documentation can be further developed to provide more insight into the average cost calculation is very welcome.
Helene Holmin
Escalation Engineer NAV Costing EMEA
hholmin@microsoft.com
Attachment: Average Cost Calc Analysis_process.xlsx