Word Management

As with the release of Microsoft Dynamics NAV 2009, I was also deeply involved in the TAP for Microsoft Dynamics NAV 2009 SP1. My primary role in the TAP is to assist ISVs and partners in getting a customer live on the new version before we ship the product.

During this project we file a lot of bugs, and the development team in Copenhagen is very responsive, so we actually get a lot of bugs fixed – but not all; it happens that a bug is closed with “By Design”, “Not Repro” or “Priority too low”.

As annoying as this might seem, I would be even more annoyed if the development team took every single bug, fixed it, ran new test passes and pushed the release date into the unknown. Some of these bugs then become challenges for me and the ISV/partner to solve, and in the process it happens that I write some code and hand it off to my contact.

Whenever I do that, two things are very clear:

  1. The code is given as is, no warranty, no guarantee
  2. The code will be available on my blog as well, for other ISVs and partners to see

and of course I send the code to the development team in Copenhagen, so that they can consider the fix for the next release.

Max. 64 fields when merging with Word

One of the bugs we ran into this time around was the fact that when doing mail merge with Microsoft Word in a three-tier environment, Word would only accept 64 merge fields. In the base application, WordManagement (codeunit 5054) only uses 48 fields, but the ISV I was working with had actually extended that to 100+ fields.

The bug is in Microsoft Word: when merging with a data source file named .HTM, it only accepts 64 fields – very annoying.

We also found that by changing the filename to .HTML, Word could actually see all the fields, and the merge seemed to work great (with one very annoying little catch) – the following dialog would pop up every time you opened Word:

[Screenshot: dialog displayed by Word when opening the document]

Trying to figure out how to get rid of the dialog, I found the right parameters to send to Word.OpenDataSource so that the dialog would disappear – but then we were right back at the 64-field limitation.

The reason for the 64-field limitation is that Word loads the HTML as a Word document and uses that document as the merge source, and in a Word document a table cannot have more than 64 columns (that's at least what they told me).

I even talked to PMs on the Word team and got it confirmed that this behavior was in O11 and O12 and would not be fixed in O14 – so no rescue in the near future.

Looking at WordManagement

Knowing that the behavior was connected to the merge file format, I decided to try to change that – why not go with a good old-fashioned .csv file instead? In my quest to learn C/AL code and application development, this seemed like a good little exercise.

So I started to look at WordManagement and immediately found a couple of things I didn’t like

MergeFileName := RBAutoMgt.ClientTempFileName(Text029,'.HTM');
IF ISCLEAR(wrdMergefile) THEN
  CREATE(wrdMergefile,FALSE,TRUE);
// Create the header of the merge file
CreateHeader(wrdMergefile,FALSE,MergeFileName);
<find the first record>
REPEAT
  // Add Values to mergefile – one AddField for each field for each record
  wrdMergefile.AddField(<field value>);
  // Terminate the line
  wrdMergefile.WriteLine;
UNTIL <No more records>
// Close the file
wrdMergefile.CloseFile;

Now, wrdMergefile is a COM component of type 'Navision Attain ApplicationHandler'.MergeHandler, and as you can see it is created client side, meaning that for every field in every record we make a roundtrip to the client (plus one extra roundtrip per record to terminate the line). We might not have a lot of records or a lot of fields, but I think we can do better (says a guy who used to think about clock cycles when writing assembly instructions for Z80 processors back in the early '80s – WOW, I am getting old :-)).

One fix for the performance would be to create the file server side and send it to the client in one go – but that wouldn't solve our original 64-field limitation. I could also create a new COM component compatible with MergeHandler that writes a .csv instead – but that wouldn't address my second goal of wanting to learn some C/AL code.

Creating a .csv in C/AL code

I decided to go with a model where I create a server side temporary file and then, for each record, build a line in a BigText and write it to the file. After the merge file is complete, it needs to be downloaded to the client and deleted from the service tier.

The above code would change into something like

MergeFileName := CreateMergeFile(wrdMergefile);
wrdMergefile.CREATEOUTSTREAM(OutStream);
CreateHeader(OutStream,FALSE); // Header without data
<find the first record>
REPEAT
  CLEAR(mrgLine);
  // Add Values to mergefile – one AddField for each field for each record
  AddField(mrgCount, mrgLine, <field value>);
  // Terminate the line
  mrgLine.ADDTEXT(CRLF);
  mrgLine.WRITE(OutStream);
  CLEAR(mrgLine);
UNTIL <No more records>
// Close the file
wrdMergefile.CLOSE;
MergeFileName := WordManagement.DownloadAndDeleteTempFile(MergeFileName);

As you can see – no COM components, all server side. A couple of helper functions are used here, but no rocket science, and it is not too different from the code that was there before.

CreateMergeFile creates a server side temporary file.

CreateMergeFile(VAR wrdMergefile : File) MergeFileName : Text[260]
wrdMergefile.CREATETEMPFILE;
MergeFileName := wrdMergefile.NAME + '.csv';
wrdMergefile.CLOSE;
wrdMergefile.TEXTMODE := TRUE;
wrdMergefile.WRITEMODE := TRUE;
wrdMergefile.CREATE(MergeFileName);

AddField adds a field to the BigText, using AddString, which in turn uses DupQuotes to ensure that any double quote (") inside the merge field value is doubled.

AddField(VAR count : Integer;VAR mrgLine : BigText;value : Text[1024])
IF mrgLine.LENGTH = 0 THEN
BEGIN
  count := 1;
END ELSE
BEGIN
  count := count + 1;
  mrgLine.ADDTEXT(',');
END;
mrgLine.ADDTEXT('"');
AddString(mrgLine, value);
mrgLine.ADDTEXT('"');

AddString(VAR mrgLine : BigText;str : Text[1024])
IF STRLEN(str) > 512 THEN
BEGIN
  mrgLine.ADDTEXT(DupQuotes(COPYSTR(str,1,512)));
  str := DELSTR(str,1,512);
END;
mrgLine.ADDTEXT(DupQuotes(str));

DupQuotes(str : Text[512]) result : Text[1024]
result := '';
REPEAT
  i := STRPOS(str, '"');
  IF i <> 0 THEN
  BEGIN
    result := result + COPYSTR(str,1,i) + '"';
    str := DELSTR(str,1,i);
  END;
UNTIL i = 0;
result := result + str;

and a small function to return CRLF (line termination for a merge line)

CRLF() result : Text[2]
result[1] := 13;
result[2] := 10;
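
To make the quoting and the line format concrete, the first few lines of a generated merge file could look something like this (the field names and values are made up for illustration):

"No.","Name","Address","City"
"10000","The ""Cannon"" Group PLC","192 Market Square","Birmingham"
"20000","Selangorian Ltd.","153 Thomas Drive","Coventry"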

When doing this I ran into some strange errors when writing both BigText and normal Text variables to the same stream – that is the reason for building everything up in a BigText and writing it once per line.

and last, but not least – a function to Download a file to the Client Tier and delete it from the Service Tier:

DownloadAndDeleteTempFile(ServerFileName : Text[1024]) : Text[1024]
IF NOT ISSERVICETIER THEN
EXIT(ServerFileName);

FileName := RBAutoMgt.DownloadTempFile(ServerFileName);
FILE.ERASE(ServerFileName);
EXIT(FileName);

It doesn't take much more than that (besides, of course, integrating this new method into the various functions in WordManagement). The fix doesn't require anything other than replacing codeunit 5054, and the new WordManagement can be downloaded here.

The question now is whether there are localization issues with this. I tried changing all kinds of settings on my machine and didn't run into any problems – but if anybody out there does run into problems with this method, please let me know.

What about backwards compatibility?

So what if you install this codeunit into a system where some of these merge files have already been created – and are indeed stored as HTML in BLOB fields?

Well – for that case, I created a function that was able to convert them – called

ConvertContentFromHTML(VAR MergeContent : BigText) : Boolean

It isn’t pretty – but it seems to work.
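
For illustration only, here is a rough C# sketch of the kind of transformation such a conversion involves – pulling the cell values out of the stored HTML table and rebuilding them as quote-doubled CSV lines. This is not the actual ConvertContentFromHTML implementation (that one is written in C/AL and works directly on the BLOB content), and the regular expressions assume a very simple table layout:

// Purely illustrative sketch – NOT the actual ConvertContentFromHTML.
// It pulls the cell values out of a simple HTML merge table and rebuilds
// them as comma-separated, quote-doubled CSV lines (one <tr> per record).
using System.Linq;
using System.Text.RegularExpressions;

static class HtmlMergeConversionSketch
{
    public static string HtmlMergeTableToCsv(string html)
    {
        var csvLines =
            from Match row in Regex.Matches(html, @"<tr[^>]*>(.*?)</tr>",
                RegexOptions.IgnoreCase | RegexOptions.Singleline)
            let cells = Regex.Matches(row.Groups[1].Value, @"<t[dh][^>]*>(.*?)</t[dh]>",
                RegexOptions.IgnoreCase | RegexOptions.Singleline)
            select string.Join(",",
                (from Match cell in cells
                 // strip any remaining markup and double embedded quotes
                 let text = Regex.Replace(cell.Groups[1].Value, "<[^>]+>", "").Trim()
                 select "\"" + text.Replace("\"", "\"\"") + "\"").ToArray());

        return string.Join("\r\n", csvLines.ToArray());
    }
}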

Feedback is welcome

I realize that by posting this, I am entering a domain where I am the newbie and a lot of other people are experts. I do welcome feedback on my way of coding – things I can do better or things I could have done differently.

 

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Edit In Excel R2 – Part 1 (out of 2)

This post assumes that you have read the 4 step walkthrough of how to build the Edit In Excel demo from November 2008. You can find the parts here: Part 1, Part 2, Part 3, Part 4 and the Bug Fix.

In this post I will talk about what is needed in order to be able to save an Excel spreadsheet on a local disc, edit it offline and then submit your changes later.

My first assumption was that this would just be a few small changes, but I was about to learn otherwise…

Goal

The success scenario is the one where a NAV user decides to take the customer table offline in Excel to do modifications. The Excel spreadsheet is saved to a local hard drive and edited while not connected to the network.

When the user reconnects to the network, he can submit his changes, which will be sent to NAV through Web Services and validated according to business rules.

The location of NAVTemplate.vsto

Let’s just try to save the spreadsheet in our Documents folder and see what happens if we open the spreadsheet. Not surprisingly we get an error telling us that it was unable to locate NAVTemplate.vsto

[Screenshot: error dialog – unable to locate NAVTemplate.vsto]

An immediate workaround is to save the spreadsheet next to the .vsto, and then it seems to work better – but the spreadsheet is not attached to NAV anymore: we cannot save changes, and Reload crashes with an exception.

The .vsto file is the deployment manifest for our managed code extension to the spreadsheet, and of course this is needed in order for the spreadsheet to work.

You can read more about the architecture of Document-Level Customizations here:

http://msdn.microsoft.com/en-us/library/zcfbd2sk.aspx

Looking at the ServerDocument class, which we use in the NAVEditInExcel solution, it has a property called DeploymentManifestUrl, which is the location of the .vsto file. Adding the following line to the NAVEditInExcel project

serverDoc.DeploymentManifestUrl = new Uri(System.IO.Path.Combine(System.IO.Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), "NAVTemplate.vsto"));

will cause the document to be created with an absolute reference to the NAVTemplate.vsto, and this solves the problem of the spreadsheet not being able to locate the .vsto. In fact, if this is a network location, it should even be possible to send the Excel spreadsheet to somebody else, who would then be able to modify the document and post the changes.
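
To put that line in context, here is a minimal sketch of the idea – open the generated workbook with ServerDocument and point its deployment manifest at the absolute NAVTemplate.vsto location. The class, method and variable names here are made up for illustration; the actual NAVEditInExcel code from the walkthrough looks different:

// Minimal sketch of the idea only – the surrounding NAVEditInExcel code (see the
// original walkthrough) looks different, and 'workbookPath' is a made-up name.
using System;
using System.IO;
using System.Reflection;
using Microsoft.VisualStudio.Tools.Applications;

static class ManifestFixupSketch
{
    public static void SetAbsoluteManifest(string workbookPath)
    {
        string vstoPath = Path.Combine(
            Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
            "NAVTemplate.vsto");

        ServerDocument serverDoc = new ServerDocument(workbookPath);
        try
        {
            // An absolute URI means the customization can be located no matter
            // where the workbook itself is saved later.
            serverDoc.DeploymentManifestUrl = new Uri(vstoPath);
            serverDoc.Save();
        }
        finally
        {
            serverDoc.Close();
        }
    }
}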

When doing this I decided to make another change in the NAVEditInExcel project. As you know, it creates a temporary template in the same location as the DLL and the VSTO – and then deletes it again. This is really not the preferred location for temporary files – we should create temporary files in the Temp folder, so we change the line setting the template name to:

string template = System.IO.Path.Combine(System.IO.Path.GetTempPath(), page + ".xltx");

since we don’t need to have the .xltx next to the .vsto anymore.

What do we need to save?

Members of the Sheet1 class marked with [Cached] will be saved together with the document – we know that from the way we transfer the page name and the current view to the spreadsheet. Thinking about it, the only things we need to save together with the document are the dataTable (which at any time contains the changes made by the user) and the objects collection (the objects returned from the Web Service).

The DataTable class implements IXmlSerializable, and the objects collection is (as we know) an array of objects returned from the Web Service; since these objects were sent over the wire from a Web Service, they of course also implement IXmlSerializable.

The fields collection cannot be saved, since the NAVFieldInfo class uses the PropertyInfo class, which cannot be serialized. The service connection of course cannot be serialized either – nor can the datalist class; as a matter of fact, the datalist class shouldn't be a member at all, it should be moved to the AddDataToExcel method as a local variable.

The problem now is that if we just mark the dataTable and objects members with [Cached], we need to initialize them in the NAVEditInExcel project, and there is no way we can instantiate the objects collection at that time with the right type.

A little more on cached data objects in Office documents

http://msdn.microsoft.com/en-us/library/ms178808(VS.80).aspx

and then how to programmatically add members to the cached objects

http://msdn.microsoft.com/en-us/library/48b7eyf3(VS.80).aspx

Using this knowledge gives us the following two lines, which we want to add after we have instantiated the dataTable and the objects array.

// Add dataTable and objects to the Caching collection
this.StartCaching("dataTable");
this.StartCaching("objects");

Now we just need to determine whether the spreadsheet was loaded from a file (and not started from NAV) and then act accordingly.

Refactoring some code

We need to do a little refactoring in order to make things work. In the existing solution the PopulateFieldsCollection method creates the fields collection, but it also creates an empty dataTable. Since we now store the dataTable and not the fields collection, the first thing we need to do in the new spreadsheet is to create the fields collection (a lot of things depend on it). This is the new PopulateFieldsCollection (and AddField):

/// <summary>
/// Populate Fields Collection with NAVFieldInfo for all properties in the record
/// Should work with any NAV 2009 Page exposed as WebService
/// </summary>
/// <param name="objType">Type of Object (typeof(Customer), typeof(Vendor), …)</param>
/// <param name="fieldsType">Type of the Enum holding the property names</param>
private void PopulateFieldsCollection(Type objType, Type fieldsType)
{
    this.fields = new List<NAVFieldInfo>();

    // Key property is not part of the Enum
    // Add it manually as the first field
    AddField("Key", objType);

    // Run through the enum and add all fields
    foreach (string field in Enum.GetNames(fieldsType))
    {
        AddField(field, objType);
    }
}

/// <summary>
/// Create a NAVFieldInfo for a field
/// </summary>
/// <param name="field">Field name</param>
/// <param name="objType">Type of Object in which the field is (typeof(Customer), typeof(Vendor), …)</param>
private void AddField(string field, Type objType)
{
    field = NAVFilterHelper.VSName(field);
    PropertyInfo pi = objType.GetProperty(field, System.Reflection.BindingFlags.Public | System.Reflection.BindingFlags.Instance);
    if (pi != null)
    {
        NAVFieldInfo nfi = NAVFieldInfo.CreateNAVFieldInfo(field, pi, objType.Namespace);
        if (nfi != null)
        {
            // If we encounter unknown Field Types, they are just ignored
            this.fields.Add(nfi);
        }
    }
}

The Load function in the old spreadsheet does a number of things. It populates the fields collection, loads the data, populates the dataTable and adds the dataTable to the spreadsheet.

We will remove it and instead create one function called LoadDataTable, which will create a dataTable (based on the fields collection), load the data from Web Services and populate the dataTable.

/// <summary>
/// Load Records from NAV via Web Services
/// </summary>
private void LoadDataTable()
{
    // Create Data Table object based on fields collection
    this.dataTable = new DataTable(this.page);
    foreach (NAVFieldInfo nfi in this.fields)
    {
        this.dataTable.Columns.Add(new DataColumn(nfi.field, nfi.fieldType));
    }

    SetFilters(this.view);
    this.objects = this.service.ReadMultiple();
    // Populate dataTable with data
    foreach (object obj in this.objects)
    {
        DataRow dataRow = this.dataTable.NewRow();
        foreach (NAVFieldInfo nfi in this.fields)
        {
            dataRow[nfi.field] = nfi.GetValue(obj);
        }
        this.dataTable.Rows.Add(dataRow);
    }
    this.dataTable.AcceptChanges();
}

As you can see, this is pieced together from the PopulateFieldsCollection, Load and PopulateDataTable functions – as a matter of fact, you can delete the PopulateDataTable function as well. BTW, the AcceptChanges call was moved here from AddDataToExcel; it needs to be together with the code populating the dataTable.

Right now my startup code in my spreadsheet has changed to

if (this.service != null)
{
    this.service.UseDefaultCredentials = true;

    // Create Fields collection
    this.PopulateFieldsCollection(this.service.GetObjectType(), this.service.GetFieldsType());

    Application.ScreenUpdating = false;

    bool loadData = this.ListObjects.Count == 0;
    if (!loadData)
    {
        MessageBox.Show("Spreadsheet loaded from disc");
    }
    if (loadData)
    {
        // Load Data into dataTable from Web Services
        this.LoadDataTable();
        // Add dataTable to Excel Spreadsheet
        this.AddDataTableToExcel();

        // Add dataTable and objects to the Caching collection
        this.StartCaching("dataTable");
        this.StartCaching("objects");
    }

    Application.ScreenUpdating = true;
}

from just invoking Load() before.

So we populate the fields collection, and then we check whether there already is a ListObject in the document (remember, the Controls collection was empty). If there is, the spreadsheet was loaded from a file and we must do something (for now we just display a message box).

If we are called from NAV (loadData becomes true) we will load the data and call AddDataTableToExcel (renamed from AddDataToExcel) and that should work.

If we try to compile now, we will see that the Reload() method uses Load() as well. We need to change Reload to

/// <summary>
/// Reload data from NAV (delete old dataTable, and load new data)
/// </summary>
internal void Reload()
{
    Application.ScreenUpdating = false;

    // Remove List Object
    if (this.dataTable != null)
        this.Controls.RemoveAt(0);
    else
        this.ListObjects[1].Delete();

    // Load Data into dataTable from Web Services
    this.LoadDataTable();
    // Add dataTable to Excel Spreadsheet
    this.AddDataTableToExcel();

    // If this reload was in fact a reattach of the spreadsheet, start caching dataTable and objects again
    if (!this.IsCached("dataTable"))
    {
        // Add dataTable and objects to the Caching collection
        this.StartCaching("dataTable");
        this.StartCaching("objects");
    }

    Application.ScreenUpdating = true;
}

Note that we remove the old ListObject in two different ways, based on whether or not dataTable is set; dataTable is null if the spreadsheet has been detached from NAV – I will touch more on that later. This is also the reason why we restart caching the dataTable and objects if this was in fact a reattach.

The solution should now work as before – the only major difference is that if you save the spreadsheet to disk and try to load it again, it should not give you an exception telling you that you cannot overlap a table with another table; instead it should give you something like:

[Screenshot: the message box shown when the spreadsheet is loaded from disc]

This is of course not the final solution, but it shows us that we are on the right track.

Restoring the “State” of the spreadsheet

To make a long story short, the lines we need in order to restore the managed list object are:

// Remove Non-VSTO List Object
this.ListObjects[1].Delete();
// Create a new VSTO ListObject – data bound
this.AddDataTableToExcel();

Meaning that we remove the unmanaged ListObject and add the managed ListObject to Excel – seems pretty easy. But what if the document is saved to disk and you then add a field to the Customer Page (and update your web reference and recompile your NAVTemplate)? Then the managed extension assembly doesn't match the saved spreadsheet anymore, and the above logic wouldn't work.

In many cases we could just say that we don't care – but the ability to save spreadsheets that are connected to Web Services and reload data adds another dimension to the entire Excel story. You can have spreadsheets that contain a lot of other things besides your dataTable, and you might not be pleased if you lose the NAV Web Service connection when this happens.

I decided to build in a way of determining this and give the user a couple of options:

// If the spreadsheet was detached already – just ignore
if (this.IsCached("dataTable"))
{
    // We have loaded a saved spreadsheet with data
    // Check that the VSTO assembly (fields) matches the spreadsheet
    bool fieldsOK = this.dataTable.Columns.Count == this.fields.Count;
    if (fieldsOK)
    {
        for (int i = 0; i < this.fields.Count; i++)
            if (this.dataTable.Columns[i].Caption != this.fields[i].field)
                fieldsOK = false;
    }
    if (!fieldsOK)
    {
        // Schema mismatch – cannot link back to NAV
        switch (MessageBox.Show("Customer Card definition has changed since this spreadsheet was saved. Do you want to re-establish the link, reload data and lose your changes?", "Error", MessageBoxButtons.YesNoCancel))
        {
            case DialogResult.Cancel:
                // Quit
                Application.Quit();
                break;
            case DialogResult.Yes:
                // Remove Non-VSTO List Object
                this.ListObjects[1].Delete();
                // Signal reload data and reestablish link
                loadData = true;
                this.StopCaching("dataTable");
                this.StopCaching("objects");
                break;
            case DialogResult.No:
                // Detach spreadsheet from NAV
                this.dataTable = null;
                this.objects = null;
                this.StopCaching("dataTable");
                this.StopCaching("objects");
                break;
        }
    }
    else
    {
        // Remove Non-VSTO List Object
        this.ListObjects[1].Delete();
        // Create a new VSTO ListObject – data bound
        this.AddDataTableToExcel();
    }
}

Three options: Cancel quits the spreadsheet open command and no harm is done; Yes removes the old table and sets loadData to true (meaning that the spreadsheet will reload data as if it was opened from NAV); and No will detach the spreadsheet from NAV (setting the dataTable to null and stopping the caching of dataTable and objects). Note that if you ever press Reload, it will re-attach the spreadsheet and load data from NAV again (the reason for checking whether the dataTable is null in Reload).

Yes or No will of course not touch the original spreadsheet, and you can always save under a new name.

BTW – we need one simple check in Save() as well

/// <summary>
/// Save Changes to NAV via Web Service
/// </summary>
internal void Save()
{
    if (this.dataTable == null)
    {
        MessageBox.Show("Spreadsheet was detached from NAV, cannot perform Save!");
    }
    else if (DoSave())
    {
        Reload();
    }
}

The only extra thing you need is to add the following line right before the return statement in DoSave():

this.dataTable.AcceptChanges();

Without this line you cannot edit the data, save changes to NAV, edit more data and then save locally.

Done

We are done – we now have a template you can use for modifying NAV data. You can save the spreadsheet locally and modify it while you are offline, and later, when you come back online, you can push your changes to NAV – and this is when validation happens.

As usual you can download the project here http://www.freddy.dk/NAVTemplateR2.zip – there are no changes to what’s needed in NAV, so you should use the objects from the original Edit In Excel walkthrough inside C/AL.

BTW, if you decide to use the source, you probably need to right-click the Web References and invoke Update Web Reference to make sure that the Web Reference matches your Customer, Item and Vendor Card Pages.

Part 2?

So why is this part 1 out of 2, you might ask yourself right now?

The reason is that I want to create better conflict resolution in the spreadsheet. With the solution we have built now, any change to a record by anybody else will cause you to lose your changes in Excel. Wouldn't it be nice if we were able to determine that we only changed the location code in Excel, so that just because somebody changed the credit limit on a couple of customers, we would still be able to re-apply our changes without having to type them in again?

This is what R2 step 2 is all about – a better conflict resolution mechanism.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV

Transferring binary data to/from WebServices (and to/from COM (Automation) objects)

A number of people have asked for guidance on how to transfer data to/from COM and WebServices in NAV 2009.

In the following I will go through how to get and set a picture on an item in NAV through a Web Service Connection.

During this scenario we will run into a number of obstacles – and I will describe how to get around these.

First of all, we want to create a codeunit, which needs to be exposed to Web Services. Our codeunit will contain two functions, GetItemPicture and SetItemPicture – but what is the data type of the picture, and how do we return that value from a Web Service function?

The only data type (supported by Web Services) that can hold a picture is the BigText data type.

We need to create the following two functions:

GetItemPicture(No : Code[20]; VAR Picture : BigText)
SetItemPicture(No : Code[20]; Picture : BigText)

BigText is capable of holding binary data (including null terminators) of any size. On the WSDL side these functions will have the following signatures:

[Screenshot: WSDL signatures for GetItemPicture and SetItemPicture]

As you can see, BigText becomes string – and strings in .NET can hold any size and any content.
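
Concretely, the proxy methods generated for the consumer project end up with roughly this shape (the surrounding proxy class and namespace depend on how the web reference is added – this is just an illustration; note how the VAR parameter becomes a ref parameter):

// Rough shape of the generated proxy methods – illustration only
public void GetItemPicture(string no, ref string picture) { /* generated code */ }
public void SetItemPicture(string no, string picture) { /* generated code */ }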

The next problem we face is that pictures typically contain all kinds of characters, and unfortunately, when transferring strings to/from Web Services, there are a number of special characters that are not handled by the Web Services protocol.

(Now you might wonder whether you can have <> in your text – but that isn't the problem :-))

The problem is LF, CR, NULL and other characters like that.

So we need to base64 encode (or something like that) our picture when returning it from Web Services. Unfortunately I couldn't find any out-of-the-box COM object that was able to do base64 encoding and decoding – but we can of course make one ourselves.

Let's assume for a second that we have a base64 COM object – then these would be our functions in C/AL:

GetItemPicture(No : Code[20];VAR Picture : BigText)
CLEAR(Picture);
Item.SETRANGE(Item."No.", No, No);
IF (Item.FINDFIRST()) THEN
BEGIN
  Item.CALCFIELDS(Item.Picture);
  // Get Temp FileName
  TempFile.CREATETEMPFILE;
  FileName := TempFile.NAME;
  TempFile.CLOSE;

  // Export picture to Temp File
  Item.Picture.EXPORT(FileName);

  // Get a base64 encoded picture into a string
  CREATE(base64);
  Picture.ADDTEXT(base64.encodeFromFile(FileName));

  // Erase Temp File
  FILE.ERASE(FileName);
END;

SetItemPicture(No : Code[20];Picture : BigText)
Item.SETRANGE(Item."No.", No, No);
IF (Item.FINDFIRST()) THEN
BEGIN
  // Get Temp FileName
  TempFile.CREATETEMPFILE;
  FileName := TempFile.NAME;
  TempFile.CLOSE;

  // Decode the base64 encoded image into the Temp File
  CREATE(base64);
  base64.decodeToFile(Picture, FileName);

  // Import picture from Temp File
  Item.Picture.IMPORT(FileName);
  Item.MODIFY;
  // Erase Temp File
  FILE.ERASE(FileName);
END;

A couple of comments to the source:

  • The way we get a temporary filename in NAV 2009 is by creating a temporary file, reading its name and closing it. CREATETEMPFILE will always create new GUID-based temporary file names – and the Service Tier will not have access to write files in e.g. the C:\ root folder and a lot of other places.
  • The base64 automation object is loaded on the Service Tier (had we wanted it on the client, it would have been CREATE(base64,TRUE,TRUE);), and this is the right location, since the exported file we just stored is located on the Service Tier.
  • The base64.encodeFromFile reads the file and returns a very large string which is the picture base64 encoded.
  • The ADDTEXT method is capable of adding these very large strings to a BigText (BTW – that will NOT work in the Classic client).
  • We do the cleanup afterwards – environmental protection:-)

So, why does the ADDTEXT support large strings?

As you probably know, ADDTEXT takes a TEXT and a position as parameters – and a TEXT doesn't allow large strings. But what happens here is that TEXT in C# becomes string, and the length checking of TEXT variables is done when assigning variables or passing parameters to functions; ADDTEXT doesn't check for any specific length (which comes in handy in our case).

The two lines in question in C# look like:

base64.Create(DataError.ThrowError);
picture.Value = NavBigText.ALAddText(picture.Value, base64.InvokeMethod(@"encodeFromFile", fileName));

Note also that the base64.decodeToFile function gets a BigText directly as a parameter. As you will see, that function just takes an object as a parameter – and you can pass anything to that function (BigText, Text, Code, etc.). You could actually also give the function a decimal variable, in which case the function would throw an exception (str as string would return null).

So now you also know how to transfer large strings to and from COM objects:

  1. To the COM object, you just pass a BigText variable directly to an object parameter and cast it to a string inside the COM object.
  2. From the COM object, you add the string return value to a BigText using ADDTEXT.
  3. You cannot use a BigText for a by-ref (VAR) parameter in COM.

In my WebService consumer project I use the following code to test my WebService:

// Initialize Service
CodeUnitPicture service = new CodeUnitPicture();
service.UseDefaultCredentials = true;

// Set the Image for Item 1100
service.SetItemPicture("1100", encodeFromFile(@"c:\MandalayBay.jpg"));

// Get and show the Image for Item 1001
string p = "";
service.GetItemPicture("1001", ref p);
decodeToFile(p, @"c:\pic.jpg");
System.Diagnostics.Process.Start(@"c:\pic.jpg");

and BTW – the source code for the two functions in the base64 COM object is here:

public string encodeFromFile(string filename)
{
FileStream fs = File.OpenRead(filename);
BinaryReader br = new BinaryReader(fs);
int len = (int)fs.Length;
byte[] buffer = new byte[len];
br.Read(buffer, 0, len);
br.Close();
fs.Close();
return System.Convert.ToBase64String(buffer);
}

public void decodeToFile(object str, string filename)
{
FileStream fs = File.Create(filename);
BinaryWriter bw = new BinaryWriter(fs);
bw.Write(Convert.FromBase64String(str as string));
bw.Close();
fs.Close();
}

If you wish to download and try it out for yourself, you can download the sources here:

The two Visual Studio solutions can be downloaded from http://www.freddy.dk/VSDemo.zip (the base64 COM object and the VSDemo test project)

The NAV codeunit with the two functions above can be downloaded from http://www.freddy.dk/VSDemoObjects.fob.

Remember that after importing the codeunit you have to expose it as a Web Service in the Web Service table:

[Screenshot: Web Service table entry for the codeunit]
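
For reference, the row you add looks something like the following. The Service Name must match what the consumer's web reference was generated against; judging from the proxy class used in the test code above, it was presumably CodeUnitPicture:

Object Type:  Codeunit
Object ID:    <the ID of the imported codeunit>
Service Name: CodeUnitPicture
Published:    Yes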

And remember to start the Web Service listener (if you are running with an unchanged demo installation).

The code shown in this post comes with no warranty – and is only intended for showing how to do things. The code can be reused, changed and incorporated in any project without any further notice.

Comments or questions are welcome.

Enjoy

Freddy Kristiansen
PM Architect
Microsoft Dynamics NAV