Edit In Excel – Part 1 (out of 4)

For the last 6-9 months, Microsoft has been showing a demo of how a user could create a filter on a List Place, invoke an action called Edit In Excel, and have Microsoft Excel open with the same records the user had selected in the List Place. The user could then edit values and post the modifications back to Microsoft Dynamics NAV – and would get a runtime error when trying to violate validation logic from NAV.

I will divide this into four parts, showing how to achieve this:

  1. Create an Excel spreadsheet, which reads the entire Customer table and shows it in Excel.
  2. Create an Excel spreadsheet, which reads the entire Customer table, shows it and allows the user to modify, delete or add records directly in Excel – and save the changes back to NAV.
  3. Hook this spreadsheet up to the Customer List Place, take the filter defined on the List Place and apply it in Excel.
  4. Make the damn thing loosely coupled – allowing it to be placed on any List Place you fancy.

I chose to divide it this way to make the process easier to follow – and I think each of these four steps will take you through a number of new things.

This first post is all about creating a spreadsheet, which reads the entire Customer table and shows it in Excel. We could do this from inside NAV (populate an XML document and send it to the client), but that would lead us into a blind alley on the way to post number 2.

So – we are going to use VSTO

What is VSTO?

VSTO (Visual Studio Tools for Office) came out as an add-on to Visual Studio 2005 and in Visual Studio 2008, it is an integrated part of the professional SKU. Basically it allows you to write code inside Excel, Word, Outlook etc. – add toolbars, menu items, subscribe to triggers and do all kinds of things using your favorite .net language – and mine is C#.
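Just to illustrate the "subscribe to triggers" part: the code behind a VSTO worksheet could look something like this (a sketch only – we won't need it for the sample below):

private void Sheet1_Startup(object sender, System.EventArgs e)
{
    // subscribe to the Change trigger of this sheet – fires whenever a cell is edited
    this.Change += new Microsoft.Office.Interop.Excel.DocEvents_ChangeEventHandler(Sheet1_Change);
}

private void Sheet1_Change(Microsoft.Office.Interop.Excel.Range target)
{
    // react to the edit here
}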

VSTO is NOT included in the Visual Studio Express SKUs.

I am not going to make this a tutorial on how to use VSTO – but I will just show a couple of basics. When creating a new project in Visual Studio, you have new options:

image

and when you choose to create an Excel Template, it actually opens up a spreadsheet inside Visual Studio.

image

Now Visual Studio might not be your favorite environment for using Excel, but that isn’t the point. If you look to the right you will see a solution and a project with a number of C# files under your .xltx “folder”. These files contain the code behind for the different sheets.

Right click on Sheet1.cs and select View Code and you will see:

image

Which kind of indicates where we are going here…

I can write code in Sheet1_Startup that connects to our Microsoft Dynamics NAV web services and reads data into the spreadsheet – could it really be that simple?

Yes, it really is that simple – but…

… when you see the code below in a moment, you will see that this approach really doesn’t scale – and it really wouldn’t allow us to edit the data – but hey, that is for post number 2 – this was the simple version, so let’s continue.

You of course need to add a Web Reference to the Customer page (Page 21 – exposed as Customer) using the following URL (if NAV is installed with the default settings):

http://localhost:7047/DynamicsNAV/WS/CRONUS_International_Ltd/Page/Customer

and call the reference CustomerRef.

Add the following code to Sheet1_Startup and run your solution.

// Postpone Screen updating
Application.ScreenUpdating = false;

// Initialize the Service
CustomerRef.Customer_Service service = new CustomerSimple.CustomerRef.Customer_Service();
service.UseDefaultCredentials = true;

// Read the customers
CustomerRef.Customer[] customers = service.ReadMultiple(null, null, 0);

// Populate the header line
int row = 1;
this.Cells[row, 1] = "No";
this.Cells[row, 2] = "Name";
this.Cells[row, 3] = "Address";
this.Cells[row, 4] = "Address_2";
// etc.

// Fill the spreadsheet
foreach (CustomerRef.Customer customer in customers)
{
    row++;
    this.Cells[row, 1] = customer.No;
    this.Cells[row, 2] = customer.Name;
    this.Cells[row, 3] = customer.Address;
    this.Cells[row, 4] = customer.Address_2;
    // etc.
}

// Set formatting for the added cells
Microsoft.Office.Interop.Excel.Range range = this.Range[this.Cells[1, 1], this.Cells[row, 4]];
range.EntireColumn.NumberFormat = "@";
range.EntireColumn.HorizontalAlignment = Microsoft.Office.Interop.Excel.Constants.xlLeft;
range.EntireColumn.AutoFit();

// Update the Screen
Application.ScreenUpdating = true;

That’s it and that’s that! This should bring up Excel looking like this:

image

Not quite the end-goal, but I hope you get the picture.
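As a teaser for post number 3: the first parameter to ReadMultiple (where we just passed null above) is an array of filters. Assuming the generated proxy contains a Customer_Filter class and a Customer_Fields enum – which is what the NAV 2009 page proxies normally give you, but check your own Web Reference – reading only a subset of the customers could look something like this (the field and criteria are just examples):

CustomerRef.Customer_Filter filter = new CustomerRef.Customer_Filter();
filter.Field = CustomerRef.Customer_Fields.Location_Code;  // field to filter on
filter.Criteria = "YELLOW";                                // same syntax as a NAV filter

CustomerRef.Customer[] filtered =
    service.ReadMultiple(new CustomerRef.Customer_Filter[] { filter }, null, 0);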

As usual – you can download the solution here http://www.freddy.dk/CustomerSimple.zip

So where do we go from here?

In the next post, we will start by removing the code we just wrote and write some new and better C# code. We will still not touch NAV itself (only access it through web services). The goal is to read the data from NAV into a table with knowledge of the field types, and to add the NAV key to a hidden column – all in order to be able to post data back to NAV.

That post will also add a couple of buttons to the Excel toolbar, with Save and Reload functions. It will still be hard-coded to the Customer table though.

In my third post I will explain how to get parameters from code on the Service Tier into our Spreadsheet (like the filter) and hook things up accordingly.

Last, but not least, I will explain how we can do this without having a Web Reference to the Customer page or other pages – how to do all of this dynamically.

Stay tuned, enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Transferring binary data to/from WebServices (and to/from COM (Automation) objects)

A number of people have asked for guidance on how to transfer data to/from COM and WebServices in NAV 2009.

In the following I will go through how to get and set a picture on an item in NAV through a Web Service Connection.

During this scenario we will run into a number of obstacles – and I will describe how to get around these.

First of all, we want to create a codeunit, which needs to be exposed as a web service. Our codeunit will contain two functions: GetItemPicture and SetItemPicture – but what is the data type of the picture, and how do we return that value from a web service function?

The only data type (supported by Web Services) that can hold a picture is the BigText data type.

We need to create the following two functions:

GetItemPicture(No : Code[20];VAR Picture : BigText)
SetItemPicture(No : Code[20]; Picture : BigText)

BigText is capable of holding binary data (including null terminators) of any size. On the WSDL side these functions will have the following signatures:

image

As you can see, BigText becomes string – and .NET strings can hold content of any size.
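In other words, the proxy generated from this codeunit ends up with methods roughly like these (a sketch – the exact class and parameter names depend on the service name and the Web Reference you create):

public void GetItemPicture(string no, ref string picture);
public void SetItemPicture(string no, string picture);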

The next problem we will face is that pictures typically contain all kinds of byte values, and unfortunately there are a number of special characters that the web services protocol cannot handle when transferring strings.

(Now you might wonder whether you can have <> in your text – but that isn’t the problem:-)

The problem is LF, CR, NULL and other characters like that.

So we need to base64 encode (or something like that) our picture when returning it from web services. Unfortunately I couldn’t find any out-of-the-box COM objects that were able to do base64 encoding and decoding – but we can of course make one ourselves.

Let’s assume for a second that we have such a base64 COM object – then our two functions in C/AL would look like this:

GetItemPicture(No : Code[20]; VAR Picture : BigText)
CLEAR(Picture);
Item.SETRANGE(Item."No.", No, No);
IF (Item.FINDFIRST()) THEN
BEGIN
  Item.CALCFIELDS(Item.Picture);

  // Get Temp FileName
  TempFile.CREATETEMPFILE;
  FileName := TempFile.NAME;
  TempFile.CLOSE;

  // Export picture to Temp File
  Item.Picture.EXPORT(FileName);

  // Get a base64 encoded picture into a string
  CREATE(base64);
  Picture.ADDTEXT(base64.encodeFromFile(FileName));

  // Erase Temp File
  FILE.ERASE(FileName);
END;

SetItemPicture(No : Code[20]; Picture : BigText)
Item.SETRANGE(Item."No.", No, No);
IF (Item.FINDFIRST()) THEN
BEGIN
  // Get Temp FileName
  TempFile.CREATETEMPFILE;
  FileName := TempFile.NAME;
  TempFile.CLOSE;

  // Decode the base64 encoded image into the Temp File
  CREATE(base64);
  base64.decodeToFile(Picture, FileName);

  // Import picture from Temp File
  Item.Picture.IMPORT(FileName);
  Item.MODIFY;

  // Erase Temp File
  FILE.ERASE(FileName);
END;

A couple of comments to the source:

  • The way we get a temporary filename in NAV 2009 is by creating a temporary file, reading its name and closing it. CREATETEMPFILE will always create new GUID-based temporary file names – and the Service Tier will not have access to write files in e.g. the C:\ root folder and a lot of other places.
  • The base64 automation object is loaded on the Service Tier (otherwise it would have to be CREATE(base64, TRUE, TRUE);) and this is the right location, since the exported file we just stored is located on the Service Tier.
  • base64.encodeFromFile reads the file and returns a very large string, which is the base64 encoded picture.
  • The ADDTEXT method is capable of adding these very large strings to a BigText (BTW – that will NOT work in the Classic client).
  • We do the cleanup afterwards – environmental protection:-)

So, why does the ADDTEXT support large strings?

As you probably know, ADDTEXT takes a TEXT and a position as parameters – and a TEXT doesn’t allow large strings. But what happens here is that TEXT becomes string in C#, and the length checking of TEXT variables is done when assigning variables or transferring parameters to functions; ADDTEXT doesn’t check for any specific length (which comes in handy in our case).

The two lines in question look like this in C#:

base64.Create(DataError.ThrowError);
picture.Value = NavBigText.ALAddText(picture.Value, base64.InvokeMethod(@"encodeFromFile", fileName));

Note also that the base64.decodeToFile function gets a BigText directly as a parameter. As you will see, that function just takes an object as its parameter – and you can pass it whatever you like (BigText, Text, Code etc.). You could actually even give the function a decimal variable, in which case it would throw an exception (str as string would return null).

So now you also know how to transfer large strings to and from COM objects:

  1. To pass a large string to the COM object, transfer a BigText variable directly to an object parameter and cast it to a string inside the COM object.
  2. To get a large string back from the COM object, add the string return value to a BigText using ADDTEXT.
  3. You cannot use a BigText as a by-ref (VAR) parameter in COM.

In my web service consumer project I use the following code to test the web service:

// Initialize Service
CodeUnitPicture service = new CodeUnitPicture();
service.UseDefaultCredentials = true;

// Set the Image for Item 1100
service.SetItemPicture("1100", encodeFromFile(@"c:\MandalayBay.jpg"));

// Get and show the Image for Item 1001
string p = "";
service.GetItemPicture("1001", ref p);
decodeToFile(p, @"c:\pic.jpg");
System.Diagnostics.Process.Start(@"c:\pic.jpg");

and BTW – the source code for the two functions in the base64 COM object is here:

public string encodeFromFile(string filename)
{
    FileStream fs = File.OpenRead(filename);
    BinaryReader br = new BinaryReader(fs);
    int len = (int)fs.Length;
    byte[] buffer = new byte[len];
    br.Read(buffer, 0, len);
    br.Close();
    fs.Close();
    return System.Convert.ToBase64String(buffer);
}

public void decodeToFile(object str, string filename)
{
    FileStream fs = File.Create(filename);
    BinaryWriter bw = new BinaryWriter(fs);
    bw.Write(Convert.FromBase64String(str as string));
    bw.Close();
    fs.Close();
}
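These two functions live inside a class that has to be COM-visible before NAV can CREATE it as an automation object. Just to sketch what that could look like (the class name and GUID below are placeholders, the method bodies are a shorter but functionally equivalent version of the code above, and the assembly would typically be registered with regasm /codebase or the "Register for COM interop" project setting):

using System;
using System.IO;
using System.Runtime.InteropServices;

[ComVisible(true)]
[Guid("11111111-2222-3333-4444-555555555555")]   // placeholder GUID – use your own
[ClassInterface(ClassInterfaceType.AutoDual)]
public class Base64Codec
{
    public string encodeFromFile(string filename)
    {
        // read the file and return it as a base64 encoded string
        return Convert.ToBase64String(File.ReadAllBytes(filename));
    }

    public void decodeToFile(object str, string filename)
    {
        // decode the base64 string (passed as object from NAV) and write it to a file
        File.WriteAllBytes(filename, Convert.FromBase64String(str as string));
    }
}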

If you wish to download it and try it out for yourself, you can get the sources here:

The two Visual Studio solutions can be downloaded from http://www.freddy.dk/VSDemo.zip (the base64 COM object and the VSDemo test project)

The NAV codeunit with the two functions above can be downloaded from http://www.freddy.dk/VSDemoObjects.fob.

Remember that after importing the codeunit, you have to expose it as a web service in the Web Service table:

image

And… remember to start the web service listener (if you are running with an unchanged demo installation).
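Assuming a default demo installation and a service name of, say, Picture (whatever name you put in the Web Service table), the codeunit should then answer on a URL like the one below – and that is the URL to point the Web Reference at when generating the CodeUnitPicture proxy:

http://localhost:7047/DynamicsNAV/WS/CRONUS_International_Ltd/Codeunit/Picture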

The code shown in this post comes with no warranty – and is only intended for showing how to do things. The code can be reused, changed and incorporated in any project without any further notice.

Comments or questions are welcome.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Running Microsoft Dynamics NAV on SSDs (Solid State Drives)

Solid state drives are here. Laptops are being sold with SSDs (still a bit expensive), and SSDs are also coming to the server market, promising incredible performance from your storage.

But what does this mean for NAV?

I contacted FusionIO (www.fusionio.com), one of the major players in the high-end SSD market, and borrowed a drive for my server. The purpose was to test the performance of the drive in our NAV performance lab in Copenhagen. I also wanted to test whether the drive was reliable and easy to install and use.

The installation was a piece of cake: open the server, insert the card, close the server, install the driver – done!

Regarding reliability (after all, I did get a beta version of the drivers) – I haven’t experienced a single problem with the server since installing the drive, so I guess that one gets a checkmark as well.

Executive summary

The solid state drive is dramatically faster in a number of the cold start scenarios – around 20-25% faster than the same test run on hard drives.

In other tests we see no big difference, which can be either because SQL Server makes extensive use of caching, or because the test scenario is heavier on the CPU of the Service Tier.

In a few tests there is a small performance drop (typically in the magnitude of < 10ms for the test) – I expect this to be measurement inaccuracy.

Some reports, which are heavy on the database, will also see a dramatic performance gain – again around 20-25%.

But the real difference shows when we run the 100 user tests – there the picture is very clear: the performance gain in a lot of the scenarios is 40%+.

Buying a solid state drive for a company running 1-10 users seems overkill; it won’t hurt performance, but the more users you have on the system, the more performance gain you will get out of a drive like this.

Of course you will see the same picture if you have a smaller number of users but a huge database, or if you for other reasons have a large number of transactions.

Remember though that these solid state drives for servers are fairly new technology and priced at around $30/GB (slightly more than a hard drive:-)) – prices vary and will probably be adjusted as we go along.

Initial testing

Before going to Copenhagen, I did a small test on the performance database (which contains a LOT of sales lines). I ran a SQL statement which calculated the SUM of all sales lines in the system.

On the hard drives this took approx. 45 seconds the first time (the second time the cache came into effect and it took 1 second).

On the solid state drive – it took approx. 2 seconds the first time (and of course 1 second the second time).

But this of course doesn’t tell anything about NAV performance on these drives…

The server runs Windows Server 2003 64-bit with SQL Server 64-bit – it has 8 GB of RAM, 3 x 500 GB hard drives (SATA, 7500 rpm – one for the system, one for data and one for the SQL Server log) and one 80 GB FusionIO drive (which in the tests holds both data and log for the databases).

The specs for the FusionIO drive are (from FusionIO’s marketing material):
– 700 MB/s read speed
– 600 MB/s write speed
– 87,500 IOPS (8K packets) transaction rate
– Sustain 3.2 GBps bandwidth with only four ioDrives
– 100,000 IOPS (4K packets) with a single ioDrive
– PCI-Express x4 Interface

BTW – these specifications are some of the fastest I have found on the market.

Intel has launched a drive specced at 250 MB/s read and 170 MB/s write with 35,000 IOPS, and the small 2.5" laptop SSDs from Imation are specced at around 125 MB/s write and 80 MB/s read with 12,500 IOPS.

These drives will probably increase performance on demo laptops a lot – but are not suited for servers.

Texas Memory Systems has launched a series of SSDs that match this performance and, like FusionIO, they are primarily focused on the server market – in fact, if you look at their document about SQL Server performance on their drives (http://www.texmemsys.com/files/f000174.pdf), you will find a Danish AX customer (Pilgrim A/S) who is live on this technology and states:

“Don’t be scared. The technology has proven itself superior in many circumstances. You just have to know when to apply it. For applications with smaller databases but heavy load, it’s a life saver”.

The purpose of this blog post is not to point at any one particular provider of SSDs – but I do want to mention that if you go for this, be aware that the performance specs on these things vary a lot – and from what I have seen, performance costs money.

Details

Note that I did absolutely nothing to tune SQL Server performance on this machine – that is why we run the tests on this same server both on hard drives and on solid state. The first rerun is the test run on the built-in hard drives and the second rerun is on the solid state drive.

The Client and Service Tier computers are kept the same, and only the location of the attached database is altered, in order to measure the difference in performance when switching to solid state.

Note also that these tests are NAV 2009 tests – I do think the picture for NAV Classic is similar when running multiple users though, since the multi-user tests don’t include UI (that would be the Client tests) and really just measure the application and stack performance.

Details – Reporting scenarios

Of the report tests I will show two results – a performance test doing Adjust Cost Item Entry and a Customer Statement. The first one hits SQL Server a lot and shows very good performance on the second rerun (the SSD); the Customer Statement is heavier on the Service Tier than on SQL Server.

clip_image002

This report is a batch job adjusting the cost on item entries – performance gain around 25%.

 

clip_image004

Customer Statement report – the performance database doesn’t contain a lot of data that affects the customer statement, so with the given data this report isn’t hard on SQL.

Details – Client scenarios

Of the Client tests, the major performance advantage comes when doing a cold start (this is where the Service Tier needs to warm up – and that of course hits the SQL Server a lot). This shows us that when running single-user scenarios in a pre-heated (warm start) environment, we don’t hit the SQL Server much (except in some reports or other special scenarios) – we probably knew this already.

clip_image006

This scenario starts up the Client and opens an existing sales order – cold start – performance gain around 15%.

The same scenario in a warm start shows no performance gain at all – probably because everything is cached.

clip_image008

Again a cold start scenario – 40% faster – the same scenario with a warm start is only a fraction faster.

Details – Multiple user tests

Now this is the area where the solid state drive should be faster – this is where the SQL Server gets hit more and where the caching on the SQL Server cannot hold everything – and I think the performance tests do show that this is the case.

I will let the numbers speak for themselves on the important performance scenarios:

clip_image010

40% faster

clip_image012

40% faster

clip_image014

40% faster

clip_image016

80% faster

clip_image018

50% faster

clip_image020

50% faster

I think all of these show a dramatic performance gain – 40% or more when running with the solid state drive – and I think this shows that the technology can do a lot for some NAV customers.

I also think that AX will see similar or better results running on solid state, especially with a large number of users (which is confirmed by the report from Texas Memory Systems).

I do think SSD technology has arrived, and when the pricing gets right, these drives will become very popular. I think we will see them take over the laptop market – but I also think these tests show that some providers are ready for the server market as well. I think the major obstacle right now is that people somehow trust their data more on hard drives than on SSDs – but that will change as we get to know the technology better.

A special thanks to the performance team in Copenhagen for helping out, and a special thanks to FusionIO for lending me an ioDrive on which I could perform the tests.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV