Channel: Dynamic Code Blocks

Dynamics GP Web Client is not compatible with Docker due to the Session Central Service not starting


Back in 2017 I attended NAVTechDays and, after seeing the benefits the Dynamics NAV community was getting from Docker containers with NAV, I could see similar benefits for Dynamics GP; I was also seeing interest in some of the conversations going on in private within the GP community. I decided to see if I could get a good selection of Dynamics GP versions into Docker. Word of that work spread and brought out more people interested in containers for GP.

I spent a lot of hours learning Docker and attempting to containerise Dynamics GP. I could build Docker images for the GP server database and add in various GP components such as eConnect and OData. Sadly I was unable to get the Dynamics GP Web Client working – I really was just a cat's whisker away from success too. I was defeated by the domain account the Web Client runs under and its use of System.DirectoryServices.AccountManagement. The following is a record of what I found when investigating that issue.

Although I can create a Docker container for the database and test company, and even things like eConnect, there is a big problem. There is no GUI with Docker containers, you can't just remote desktop into one, so you really need the Web Client running in order to make a self-contained system that lets the host interact with the Dynamics GP application in the container. If you can't access it from a browser, you end up having to install the GP client on a non-Docker machine (the host) and use it to connect to the running Docker container. Although this works, it's not really in the spirit of what I was trying to achieve: a fully self-contained container for GP, allowing multiple versions of GP to be easily spun up for testing and demo purposes.

Why did I fail?

After the Docker image has installed GP and the web client product features are installed, one of these, the Session Central Service, fails to start within the container. The Session Central Service is the component of the Dynamics GP web client installation that creates and tracks individual web client sessions as users connect and disconnect via web browsers. It is essential to getting the web client to work.

At this point let me jump right to the core issue: essentially the problem is that the “Lanman” Windows service can’t be started in a Windows container. The Lanman service provides things like SMB file share services and is also referred to as the “Server service”, as that is how it appears, named in the services control panel list of services.

Containers are supposed to be stateless, isolated, single-app process groups, not mini virtual machines, so running Lanman would break that philosophy; hence it is not appropriate for it to be supported in Windows Docker containers. Containers are supposed to be created and torn down at the drop of a hat, and participating in Active Directory goes against the grain. This is why, along with a few other Windows features that are inappropriate for containers, the Lanman service just isn’t supported in a Windows container.

So why is this a problem?

You find that the Session Central Service checks the user it is running as, see (2) in the following screenshot, and it does this using a call to the following method:

Dynamics.GP.Web.Foundation.DirectoryServices.PrincipalManager.GetPrincipal

This method in turn calls .NET method:

System.DirectoryServices.AccountManagement.Principal.FindByIdentityWithType(PrincipalContext context, Type principalType, IdentityType identityType, String identityValue)

It is the above call that fails; looking in the event log we see it reporting, as indicated by (1) in the screenshot below, that:

Exception message:
                      The Server service is not started.

Possible root cause

Eh? "The Server service is not started" – a confusing message, as it means that the Lanman service (aka the Server service) is not started. Yes, this was a super confusing error at the start of this investigation; not realising there was a "Server service", I assumed it was just a poor error message that wasn't telling me which service had failed. Once I realised what it meant, I found that, indeed, if you go into the Docker container with PowerShell you can check that the Server service is not started and even try to start it; however, it will not start, due to the lack of support in Windows containers, as previously stated. The Lanman service can't be run in a Windows container, thus preventing the use of .NET's System.DirectoryServices.AccountManagement. Microsoft has in the past also confirmed there are no plans for this to ever change, as it is fundamentally at odds with container philosophy to have this service running.
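From a PowerShell prompt inside the container, the failure can be reproduced along these lines (a sketch; the exact exception text varies by Windows version):

```powershell
# Inside the Windows container, e.g. after: docker exec -it <container> powershell
# LanmanServer is the service name behind the "Server" display name
Get-Service LanmanServer | Select-Object Name, DisplayName, Status, StartType

# Attempting to start it fails - the Server (Lanman) service
# is one of the Windows features not supported in containers
Start-Service LanmanServer
```

The `Get-Service` output is what first confirmed for me that a "Server" service even exists, which the error message assumes you already know.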

Without a small change to the Session Central Service code I was not going to get anywhere with this until…

Docker and Group Managed Service Account (gMSAs)

Hope followed when I discovered the existence of Group Managed Service Accounts (gMSAs), which I thought might provide a route around the lack of Lanman support. Using a gMSA we can authenticate to Active Directory resources from a Windows container that is not part of your domain (remember, the container is not domain joined). A gMSA is a special account that (over-simplifying it) allows a domain account to be passed through for the purposes of running a service. This could be perfect for replacing the user running the Session Central Service; instead it can run using a gMSA account for authentication, which in turn would bypass Lanman. When you run a container with a gMSA, the container host retrieves the gMSA password from an Active Directory domain controller and gives it to the container instance. The container will use the gMSA credentials whenever its computer account (SYSTEM) needs to access network resources. In our case, when building the Docker container we can use PowerShell to change the Session Central service to use the SYSTEM account, having prepared the plumbing required for gMSA to authenticate against AD. SYSTEM becomes a kind of proxy for the AD account behind the gMSA. For this to work certain prerequisites need to be met; there is quite a bit of learning involved, more than I want to get into here. I built this up in Docker to test.
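The plumbing looks roughly like this (a sketch only; the gMSA name, image name and file names are placeholders, and it assumes the CredentialSpec PowerShell module from Microsoft's windows-server-container-tools is installed on a domain-joined container host with the gMSA already created in AD):

```powershell
# On the domain-joined container host: generate a credential spec
# file for an existing gMSA ('gpgmsa01' is a placeholder account name)
Import-Module CredentialSpec
New-CredentialSpec -AccountName gpgmsa01
# writes a JSON file under C:\ProgramData\Docker\CredentialSpecs\

# Run the container with the credential spec; processes running as
# SYSTEM or NETWORK SERVICE inside it then authenticate to AD as the gMSA
docker run --security-opt "credentialspec=file://gpgmsa01.json" -d mygpimage

# Inside the container image build, repoint the service at a built-in
# account so it picks up the gMSA (service name is an assumption here)
sc.exe config "DynGPSessionCentralService" obj= "LocalSystem"
```

The key idea is that nothing inside the container ever holds the gMSA password; the host fetches it from a domain controller on the container's behalf.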

Sadly, on testing this thoroughly for a few days, it turns out there is another problem. A Docker container can access AD using the gMSA account, allowing FindByIdentity to work WHEN the constructor for PrincipalContext is created with “ContextType.Domain” and specifies an accessible AD. The group managed service account technique does not work when a context type of “ContextType.Machine” is used to create the PrincipalContext.
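The difference can be demonstrated by exercising the .NET API directly from PowerShell (a sketch; 'mydomain.local' and 'someuser' are placeholders, and the domain call assumes a container running with a gMSA credential spec):

```powershell
# Load the .NET assembly the Session Central Service uses
Add-Type -AssemblyName System.DirectoryServices.AccountManagement
$ns = 'System.DirectoryServices.AccountManagement'

# Domain context: works from the gMSA-enabled container
$domainCtx = New-Object "$ns.PrincipalContext" (
    [System.DirectoryServices.AccountManagement.ContextType]::Domain, 'mydomain.local')
[System.DirectoryServices.AccountManagement.UserPrincipal]::FindByIdentity($domainCtx, 'someuser')

# Machine context: fails in a container - local account lookups
# depend on the Server (Lanman) service, which cannot start there
$machineCtx = New-Object "$ns.PrincipalContext" (
    [System.DirectoryServices.AccountManagement.ContextType]::Machine)
[System.DirectoryServices.AccountManagement.UserPrincipal]::FindByIdentity($machineCtx, 'someuser')
```

Only the second call raises the "The Server service is not started" exception, which is exactly the split the GP code falls on.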

If we look at what is going on in the class in the screenshot, we find that the class is created with a constructor specifying machine as the context, stumping our attempts to use the gMSA to get around the lack of Lanman. This is due to no user name getting passed to the IsMachineContext function (see the last screenshot). I'm guessing that using the built-in NETWORK user means no user name is passed into this method, thus resulting in the highlighted code (1) being run in the following screenshot. This was upsetting to find, as I thought I had it cracked at this point.

Not all that learning is a loss, as it was interesting to learn about gMSAs; they are an important container and enterprise computing concept to understand.

The following screenshot shows the code causing our issue (1) in the Microsoft.Dynamics.GP.Web.Foundation.DirectoryServices.PrincipalManager.

 

GetPrincipal2

 

The user name being blank when calling the following IsMachineContext method means the split in the logic in the above screenshot causes it to create the PrincipalContext with the wrong context type to use gMSA.

ismachinecontext

Finally


We can create containers with the database and test company, but sadly we end up having to install the GP client on the host to access the container, which was not really my objective.

It would be amazing if the code could be updated by Microsoft, but I don’t see this happening. I need to refresh myself on the exact details of this issue and check again that nothing has changed in the years since I was trying, as Docker is constantly being improved with new features.


Dynamics GP 2015 email–”Word experienced an error trying to open the file error sending email” when using Word Templates


When attempting to send a document that utilises Word templates, even using PDF format, the following error may occur, due to a tightening of security within the Microsoft Office product: “Word experienced an error trying to open the file”.

Word experienced an error trying to open the file. Try these suggestions: Check the file permissions for the document or drive


Information about this issue can be found here: Word Templates will not Email/Print after Office Update

As I understand it, the XML generated by Dynamics GP is not compliant and causes Office versions with the patch to fail to load it. This is also true if you send someone a Word document that was generated in Dynamics GP: if they have patched Office, the document will not open. When Dynamics GP sends a PDF/Word email it saves the document into a temp directory first, then sends it, turning it into a PDF as required. This is why even PDF documents are affected too.

Solution for GP versions after 2016

The solution is to apply the latest GP patches if you are on a supported version of GP, 2016 and above (at time of writing).

Those updates found here:

Microsoft Dynamics GP U.S. Year-End Update 2020 RELEASED!!


Bodge solution for GP 2015 – maybe previous versions too, I’ve not tested

If you are running unsupported GP 2015, then you are stuck without a solution, other than the correct one: you really should be upgrading to a supported version of GP for these exact kinds of reasons. In the real world there are many reasons why this may not be possible quite yet, and you don’t want to postpone rolling out Office updates, as for security reasons you really shouldn’t hold back on updates.

There is a totally unsupported bodge you can use that may give you some time to upgrade whilst still benefiting from the latest office updates.  Please note this action should be considered carefully as it has not been talked about by Microsoft as an option.


Fix Dynamics GP 2015 Word experienced an error issue

To fix this issue for Dynamics GP 2015, in a totally unsupported way – this is at your own risk!

Download GP 2016 – more importantly, download the latest GP 2016 cumulative update – and install the client only. Obviously you don’t want to upgrade your GP install yet, and have no need for the database components; perhaps do this on another machine for peace of mind. You must have the release that contains the bug fix, so follow the link above. It must be the 2016 version of GP; this will not work if you do this with the 2018 version!


Download for GP2016 patch


In the Dynamics GP program files folder of the 2016 install, locate the following two files:

Microsoft.Dynamics.GP.BusinessIntelligence.Office.dll
Microsoft.Dynamics.GP.BusinessIntelligence.TemplateProcessing.dll

  • Ensure Dynamics GP is closed on the machine that you are working on
  • Also find the same files in the GP2015 install folder, take a copy of them onto your desktop or somewhere safe, so you have a way to go back if you need to
  • Now replace only these two files in the Dynamics GP2015 folder with those from the GP2016 folder.

You may now launch GP 2015. You should find that the ability to send emails of Word Template based reports has been restored. Roll these two files out to your other workstations that use Word template reports.
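The swap can be scripted per workstation along these lines (a sketch; the install paths are the typical defaults and are assumptions, so adjust them to your environment):

```powershell
# Hypothetical default install paths - adjust to your machines
$gp2015 = 'C:\Program Files (x86)\Microsoft Dynamics\GP2015'
$gp2016 = 'C:\Program Files (x86)\Microsoft Dynamics\GP2016'
$files  = 'Microsoft.Dynamics.GP.BusinessIntelligence.Office.dll',
          'Microsoft.Dynamics.GP.BusinessIntelligence.TemplateProcessing.dll'

# Back up the GP2015 originals first so there is a way back
$backup = Join-Path $env:USERPROFILE 'Desktop\GP2015-dll-backup'
New-Item -ItemType Directory -Path $backup -Force | Out-Null

foreach ($f in $files) {
    Copy-Item (Join-Path $gp2015 $f) $backup -Force          # keep original
    Copy-Item (Join-Path $gp2016 $f) (Join-Path $gp2015 $f) -Force  # swap in 2016 DLL
}
```

Run it with GP closed on the workstation, as per the steps above.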

These were the version numbers on those files from the attempt I made:

version 16.0.865.0 Microsoft.Dynamics.Gp.BusinessIntelligence.TemplateProcessing.dll



Note: this is a totally unsupported method; use it at your own risk. However, with these just being reporting and Office library DLLs, the risk to GP should be low in my opinion. Although this is an unsupported change, if you are on such an old version of GP you are running unsupported already to some extent anyway! Also note that this will only work with version 2016 -> 2015. When I tried it with a more modern version of the DLL, GP failed to open.

MyTNT Missing Pending Shipment Menu item fixed


Pending Shipment Menu Item Missing from MyTNT web application

For some users the Pending Shipment menu item was found to be missing when they logged into the MyTNT website. This was a cross-browser issue; using different login credentials or clearing cookies and cache didn't clear it either. Nor was it user-rights related in the MyTNT application.

This was found to be linked to specific users having a particular Active Directory group policy applied to them. This policy was blocking internet access, except for certain domains, and was the immediate clue as to what might be happening.

If you look in the image below, Pending Shipments does not exist in Domain User A's web browser, but does for Domain User B. Also note the hamburger menu is present for B too.

After investigating in the browser's developer tools to see which page objects were failing to load, one set of requests became the prime suspect: requests to https://cdn.optimizely.com/js/5435521705.js

Blocking this domain (optimizely) on a machine that was not affected caused the Pending Shipments menu option to vanish after clearing caches on that previously working machine – aha!

Thus group policy was adjusted to allow this content distribution network, and all was well again. 

 

 

 

Deploying Dynamics GP Addin Using Visual Studio Menu Item


I demonstrated this technique in one of the sessions I presented on GP development at the 2016 Tampa GPUG conference, but never got around to writing about it.

Often when developing Dynamics GP Addins, you may wish to deploy the current build to your local copy of Dynamics GP, perhaps for testing. This can be done manually but gets old real quick after many iterations.

I set my development environment up by creating a Visual Studio Menu item to deploy the current build of my GP Addin. This means I can simply use a hot key to deploy my code to the application and start debugging within GP or testing.

This solution copies the build output into the application AddIns folder; if the build is a release build, it also removes any debug symbol (.pdb) files it finds there.

Follow this guide for what to do, the following screen shot shows the “Deploy GP Addin Locally” option we are aiming to build…

Create the Menu Item in Visual Studio

Visual Studio External Tools Menu Item

Visual Studio lets you add custom Tools to the Tools menu, select “Tools>>External Tools…”, to set this up.

Visual Studio External Tools Window

Use the “Add” button on the form to create a new entry with the following values:

Title: Deploy GP Addin Locally
Command: powershell.exe
Arguments: -ExecutionPolicy RemoteSigned -File "$(SolutionDir)\Deployment Batch Files\DeployGPLocally.ps1" -BuildOutputFolder $(BinDir)
Use Output window: “Ticked”



Click Apply and OK on the window to preserve your changes.

The “Deploy GP Addin Locally” menu option should now exist on the Tools menu of Visual Studio, but it won’t work yet, not without creating the PowerShell script it calls.

The menu item calls the PowerShell interpreter (powershell.exe), passing the script to run and the build directory to run it against as command arguments.

The External Tools window provides some super handy environment variables you can insert into the fields. Click the little arrow buttons to the right to see the options available, useful for future reference. In this case we take the binary output build location, $(BinDir), and use that folder as the basis for the script to run against. We also use the $(SolutionDir) variable to locate the PowerShell script, as different developers may have the source in different root directories.

Create the PowerShell script

Next create the script that is called from the menu item. I have a folder in the solution for this script (Deployment Batch Files); you can adjust the location to your needs, changing the path as required.

Save the following into a text file named “DeployGPLocally.ps1” in a “Deployment Batch Files” folder located off the root of the solution folder for the AddIn solution.

The script takes the build folder as a parameter; it then looks up the Dynamics GP application location from the registry, which is then used as the destination for the copied files.

The script decides whether Visual Studio is in debug or release build mode by looking for the folder “Debug” in the current build path. It then copies the files appropriate to the build type to the application AddIns folder.


DeployGPLocally.ps1

param ($BuildOutputFolder)
### Script to automate deployment of GP Addins to the installed application AddIns folder during development
### Written by T. Wappat 2021-11-25
###
### Todo - check folders exist before operation
### Todo - provide a list of registry locations for various versions of GP
###        https://timwappat.info/post/2016/12/02/Lookup-Dynamics-GP-install-location-from-registry-key
# $AddInsPath = "C:\Program Files (x86)\Microsoft Dynamics\GP2015\AddIns\"
# Fetch the location of the GP application from the registry (this requires that the GP installer was used to install it)
$AddInsPath = Get-ItemPropertyValue 'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Business Solutions\Great Plains\v14.0\1033\DEFAULT\SETUP' 'AppPath'
$AddInsPath = "$($AddInsPath)AddIns\"
# Note the following is a regular expression match on the build output path
if($BuildOutputFolder -match '\\Debug\\' )
{
	Write-Output "Starting to deploy GP (Debug) to $AddInsPath"
	Write-Output "================================================================================================="
	Copy-Item -Path ($BuildOutputFolder + "\*.dll") -Destination $AddInsPath -Recurse -Force -Verbose
	Copy-Item -Path ($BuildOutputFolder + "\*.pdb") -Destination $AddInsPath -Recurse -Force -Verbose
	Write-Output "================================================================================================="
	Write-Output "Deployed (Debug) GP to $AddInsPath"
}
else
{
	Write-Output "Starting to deploy GP (Release) to $AddInsPath"
	Write-Output "================================================================================================="
	Copy-Item -Path ($BuildOutputFolder + "\*.dll") -Destination $AddInsPath -Recurse -Force -Verbose
	# Remove any debug symbols left behind by previous debug deployments
	Get-ChildItem -Path $AddInsPath *.pdb | ForEach-Object { Remove-Item -Path $_.FullName -Verbose }
	Write-Output "================================================================================================="
	Write-Output "Deployed (Release) GP to $AddInsPath"
}

Dynamics GP–syWindowDefaults cannot find the table SY01407

$
0
0

A get/change operation on table ‘syWindowDefaults’ cannot find the table.

After upgrading a Dynamics GP system from GP2015 => GP2016 => GP(18.4) the following error occurred on opening some enquiry windows in GP;

A get/change operation on table ‘syWindowDefaults’ cannot find the table.

Invalid object name ‘SY01407’



The above window is used for the “Default Inquiry Sort Options for Payables, Receivables and Bank Reconciliation”, a feature introduced in version 18.4 of Dynamics GP.

According to the linked documentation, this feature saves some of the field settings and window sizes for windows:

     Receivables Transaction Inquiry - Customer  (Inquiry > Sales > Transaction by Customer)

     Receivables Transaction Inquiry - Document  (Inquiry > Sales > Transaction by Document)

     Payables Transaction Inquiry - Vendor  (Inquiry  > Purchasing > Transaction by Vendor)

     Payables Transaction Inquiry - Document  (Inquiry > Purchasing > Transaction by Document)

     Checkbook Register Inquiry  (Inquiry > Financial > Checkbook Register)

     Checkbook Balance Inquiry  (Inquiry > Financial > Checkbook Balance)

     Sales Order Processing Item Inquiry  (Inquiry > Sales > Sales Items)


The information for the feature is stored in the following table:

Dynamics GP table name: SY01407
Dynamics GP Technical Name: syWindowDefaults

Investigation

Checking in the database to see if the table was in SQL Server proved it was not:

select * from DYNAMICS..SY01407

Msg 208, Level 16, State 1, Line 1
Invalid object name 'DYNAMICS..SY01407'.

So it would seem this table got missed when the GP installer/upgrade scripts were created!

There is evidence of other people having similar experience on the forums, Inquiry error message missing table SY01407


Creating table SY01407 and its associated database objects

Being a new table it is easy to build as you don’t need to worry about pre-existing data. Do so using the Dynamics GP SQL Maintenance window as user sa.

Microsoft Dynamics GP>>Maintenance>>SQL

Select Database DYNAMICS

Product Microsoft Dynamics GP

Locate the table in the list (“System Window Defaults”) and select it (you will see 1 object selected, as shown below)

Select the checkboxes to “Create Table” and “Create Auto Procedure”

SQL Maintenance Window Dynamics GP

Then select the Process button; the missing table will be created along with some stored procedures, and the error messages should stop occurring as the defaults feature starts to function.

Thanks to Beat Bucher for confirming the issue for me and confirming I could use the above window to build the missing database object.

He noted there is also a script you can obtain from GP support if you raise a ticket, that effectively does the same thing as the GP client does above.

Resulting database objects created

table:
[dbo].[SY01407]

stored procedures:
[dbo].[zDP_SY01407F_1]

[dbo].[zDP_SY01407L_1]

[dbo].[zDP_SY01407N_1]

[dbo].[zDP_SY01407SD]

[dbo].[zDP_SY01407SI]

[dbo].[zDP_SY01407SS_1]
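You can verify the objects listed above were created with a quick query (a sketch; it assumes the SqlServer PowerShell module is installed, and 'GPSQL01' is a placeholder server name):

```powershell
# Requires: Install-Module SqlServer
Import-Module SqlServer

# List the table and its auto-generated zDP stored procedures
Invoke-Sqlcmd -ServerInstance 'GPSQL01' -Database 'DYNAMICS' -Query @"
SELECT name, type_desc
FROM sys.objects
WHERE name = 'SY01407' OR name LIKE 'zDP_SY01407%'
ORDER BY type_desc, name
"@
```

Seven rows back (one table, six procedures) indicates the SQL Maintenance window did its job.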

Important Note:

Although this will get you running, it may be a sign of other issues with the upgrade. It is always important to run a dummy upgrade and perform User Acceptance Testing on it to ensure everything is ok before doing the upgrade of the production environment.

Dynamics GP Focus trigger registration failed: taUpdateCosts after installing Shipping Notification Tool 2013 on GP18.4


Focus trigger registration failed: taUpdateCosts

After installing the Shipping Notification Tool on Dynamics GP 18.4 the error “Focus trigger registration failed: taUpdateCosts” is shown after launching Dynamics GP.

I guess some Dexterity changes were made to the forms that the tool plugs into, causing the Dex trigger to fail like this.

Luckily Harry Lee has already investigated this issue on the community forum post Focus trigger registration failed: taUpdateCosts after Shipment Notification Installed and Configured.

There is another build of the tool, "Shipment Notification Tool version 2018", that fixes the issue, available from Microsoft Support under a free support ticket.

After dropping the new .cnk file into the GP application folder, this new build fixes the issue.

If you already have the product installed (as this is an upgrade), do not run the taShipmentnotification_Create.sql script, as it drops your table. I did a file comparison between the previous version and this version and they are identical, so no schema changes were made in the tool.

Note:

There is a lack of information in the partner channel about the Shipping Notification Tool; it is often thought to be part of the Professional Services Tools Library (PSTL). This is not the case: the PSTL is now part of the main media installer for GP, and the Shipping Notification Tool is a separate .cnk available only from Microsoft Support via a support case.

Dynamics GP Smart List Export to Excel found a problem with content, repaired records


After moving from Office 2013 to 365, the following error appears when users export from Dynamics GP SmartLists to Excel.

We found a problem with some content in 202222-123259.XLSX. Do you want us to try to recover as much as we can? If you trust the source of this workbook, click Yes.

Found a problem with some content in Do you want to try to recover as much as we can?

Followed by

Excel was able to open the file by repairing or removing the unreadable content.
Repaired Records Format from /xl/styles.xml part (Styles)

Repaired Records: Format from /xl/styles.xml part (Styles)

 

The file generated as a log says pretty much the same thing

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<recoveryLog xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">
     <logFileName>error491840_01.xml</logFileName>
     <summary>Errors were detected in file ’C:\Users\***\AppData\Local\Temp\202222-123259.XLSX’</summary>
     <repairedRecords>
         <repairedRecord>Repaired Records: Format from /xl/styles.xml part (Styles)</repairedRecord>
     </repairedRecords>
</recoveryLog>

 

Adding the following line somewhere in the Dex.ini file (normally in the Data subfolder of the GP application folder) resolved the issue in this case.

SmartlistEnhancedExcelExport=TRUE

This ini switch is not compatible with the GP web client if you are using that.
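Rolling the switch out to workstations can be scripted (a sketch; the Dex.ini path shown is a typical default and is an assumption, so adjust it to your install location and version):

```powershell
# Hypothetical default location - adjust to your GP install
$dexIni = 'C:\Program Files (x86)\Microsoft Dynamics\GP\Data\Dex.ini'

# Append the switch only if it is not already present in the file
if (-not (Select-String -Path $dexIni -Pattern '^SmartlistEnhancedExcelExport=' -Quiet)) {
    Add-Content -Path $dexIni -Value 'SmartlistEnhancedExcelExport=TRUE'
}
```

The guard avoids duplicate entries if the script is run more than once on the same machine.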

References:

Undocumented DEX.INI switch cuts down SmartList export times to Microsoft Office Excel

Smartlist: Exports slowly to Excel – Part 1

Dynamics GP Install/upgrade failed SyCompanyImages


install upgrade failed syCompanyImages

Interesting issue with upgrading Microsoft Dynamics GP.

The upgrade stopped at the above step

Microsoft Dynamics GP Utilities install/upgrade failed
syCompanyImages


Turns out this was linked to deleting companies from this instance of GP, with the delete leaving orphaned records in the DYNAMICS..syCompanyImages table.

The syCompanyImages table holds the images used by Word template reports, which are uploaded using the form below:

Reports>>Template Configuration>>[Images] button

Company Images Window Dynamics GP

There was an orphaned record from the deleted company in the table that backs that form. The extra record was removed using the following script:

DELETE FROM Dynamics.dbo.syCompanyImages
WHERE (RELID/65536) NOT IN
(
   SELECT CMPANYID FROM Dynamics.dbo.SY01500
)


Note that I don’t entirely understand how this table works, other than that it holds a binary blob of the image, referenced back via this bizarre /65536 formula. I would check which records will be deleted before actually running this script.

Note that it is important to have run ClearCompanys.sql (the script that clears out all entries in the DYNAMICS database referencing databases that no longer exist on the SQL Server) BEFORE running the above script. This is to make sure that SY01500 only has companies that have databases associated with them.
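To preview what would be deleted first, the same predicate can be run as a SELECT (a sketch; it assumes the SqlServer PowerShell module, and 'GPSQL01' is a placeholder server name):

```powershell
# Requires: Install-Module SqlServer
Import-Module SqlServer

# Show the orphaned image records before deleting anything
Invoke-Sqlcmd -ServerInstance 'GPSQL01' -Database 'DYNAMICS' -Query @"
SELECT RELID, RELID/65536 AS DerivedCompanyID
FROM dbo.syCompanyImages
WHERE (RELID/65536) NOT IN (SELECT CMPANYID FROM dbo.SY01500)
"@
```

Any rows returned are the records the DELETE above would remove; an empty result means there is nothing orphaned to clean up.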


Dynamics GP folder path incorrect when using Create Installation Package utility

$
0
0

GP Create Installation Package Menu

Application folder for Dynamics GP moved

The default installation location for the Dynamics GP application moved to

C:\Program Files (x86)\Microsoft Dynamics\GP

for releases after GP 2018. Previously it was C:\Program Files (x86)\Microsoft Dynamics\GP2018, where GP2018 was replaced with the version of GP. As the versions are no longer linked to years, this was dropped as a folder naming pattern.


Create installation package utility

The Create Installation Package utility, found on the GP installation media main menu, did not get the memo about the folder change – or, to be more accurate, the change introduced a bug. We find it broken since this change was made. After creating an installation package, and then using that installation package to install GP, the folder is not “C:\Program Files (x86)\Microsoft Dynamics\GP”; instead it is “C:\Program Files (x86)\Microsoft Dynamics\GP2018”, as it was in the previous version.

I’ve just tested this with the GP fall 2021 release (V18.4), and that release still installs the application into the “wrong folder”.

When the package is built we are prompted for the folder location:

Dynamics GP Installation Package folder C:\Program Files (x86)\Microsoft Dynamics\GP2018


As can be seen, the location entered is correct (C:\Program Files (x86)\Microsoft Dynamics\GP\).

What makes things worse, if you specify the local location for the dictionaries, the resulting launch file (.SET) will then happily mix the file locations, resulting in a mixture of GP and GP2018 folders in the .SET file!

Report DIC and Forms DIC locations

The fix/workaround

I stumbled on a fix or workaround on Ian Grieve's blog, in this comment at the bottom of his hands-on guide to using the Create Installation Package utility (which, if you are not familiar with the utility, is a good guide to go by):

2022-02-23_15-51-42

Hands On with Microsoft Dynamics GP Fall 2021 Release: Create Installation Package


So if we try this by removing that trailing slash as shown below, then indeed the application ends up, correctly, in the GP folder rather than the GP2018 folder! I have no idea how this was discovered, but well done!

remove trailing slash from install path

See how the slash has been removed, it works!

Tasks to think about when migrating SQL Server for Dynamics GP to a new server


After performing a SQL migration I had this top-level list of things I had to think about, so this is just a private list, put here, of things you might need to think about when migrating from one SQL Server to another. This is for my environment; if you use clustering and other features of SQL Server, then you will have more to add to the list. Hopefully this will help me next time, and it may provoke some thoughts for others working on a similar project.


In this case Reporting Services is co-hosted on the server, as are the eConnect service and some IIS API endpoints.
The items are in random order; there are many interdependencies.

  • Linked servers: script out any linked SQL servers and re-create them on the new server.
  • SQL Agent jobs need scripting out and re-creating on the new server. Any Reporting Services jobs need removing; let Reporting Services rebuild the schedules when its service starts. References to server names need checking within the scripts.
  • Maintenance plans need moving to the new server.
  • SQL logins need scripting out and moving to the new server.
  • TempDB files need creating with an appropriate number of data files for the server, and on appropriate disks.
  • Replication needs scripting, removing, then rebuilding on the new server. Change server names in replication, as it has to use the actual server names, not aliases.
  • The user databases need backing up from the old server and restoring to the new server.
  • Reporting Services, if installed, needs keys applying and configuring with the new host names in the .config file; restore the encryption keys from the old server.
  • Backup devices need scripting out and recreating on the new server.
  • The eConnect service needs installing on the new server.
  • The eConnect service config needs updating for changes to port numbers and transport protocol used (if not using defaults).
  • Any IIS web services need migrating to the new server (if any installed).
  • Script out extended events and recreate them on the new server.
  • SSIS packages need exporting and importing to the new server.
  • SQL operators need scripting out and moving to the new server.
  • After migration, switch DNS aliases to point at the new server (including Reporting Services and any IIS site aliases, if applicable).
  • Check database collations are set correctly on the new server vs the old.
  • Compare the new and old SQL Server settings pages side by side for all settings.
  • Start SQL Agent jobs after migration, once the old server is out.
  • Check virtual machine settings after migration to ensure full resources have been restored to the virtual machine.
  • Ensure full-text indexing is installed (if required).
  • Ensure SSIS is installed (if required).
  • Ensure Reporting Services is installed (if required).
  • Install VMware SCSI drivers for performance reasons.
  • Check firewall settings on the machine are the same between old and new, including application settings.
  • Duplicate any Windows file shares on the new server to be the same as the old server, including share and NTFS permissions.
  • Double-check any local users and user permissions.
  • Install the Dynamics GP Workflow server CLR objects on the server using stored procedure DYNAMICS..wfDeployClrAssemblies.
  • Migrate mail profiles from the old to the new server.
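To illustrate a couple of the checks in the list above, here is a hedged sketch (not from the original checklist) of post-restore verification queries using standard SQL Server catalog views, for comparing collations and spotting orphaned database users after the user databases are restored onto the new server:

```sql
-- Check database collations on the new server (compare with the old server).
SELECT name, collation_name
FROM sys.databases;

-- Find orphaned SQL-authenticated database users after the restore,
-- i.e. users whose SID no longer matches a login on the new server.
SELECT dp.name AS orphaned_user, DB_NAME() AS database_name
FROM sys.database_principals dp
LEFT JOIN sys.server_principals sp ON dp.sid = sp.sid
WHERE dp.type = 'S'          -- SQL users
  AND dp.authentication_type_desc = 'INSTANCE'
  AND sp.sid IS NULL;
```

Run the orphaned-user query in each restored database; any rows returned will need their logins re-created or re-mapped.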

Dynamics GP Payables - A unique document number could not be found. Please check setup


Payment run failed after GP upgrade

On upgrading to GP version 18.4.1361 (GP 2021), everything worked great except for making EFT payments.

On printing the EFT payments, the following error popped up.

A unique document number could not be found. Please check setup.



Although it could be clicked through the first time, attempting subsequent EFT payments resulted in another message box.

An error occurred. Contact your system administrator to verify data.


This was a real problem, as it prevented any further EFT payments, although manual payments were absolutely fine.


Diagnosing the problem

I posted the issue on the community forum and lodged a ticket with our support partner, just in case this couldn’t be resolved by ourselves… then it was time to dig in and investigate, as not paying suppliers is a big issue…

This was a familiar message to me from other modules of GP, where either the number sequence has gone so far out of range that GP fails to scan for a new number, or the number sequence has not been defined – this experience helped me later narrow in on the problem. I checked the Purchasing setup window and the various GP tables holding the next numbers for both that chequebook and the payables transactions. All were looking correct.
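For reference, the kind of check made here can also be done directly in SQL. This is a sketch; CM00100 is the Chequebook (Checkbook) Master table and NXTCHNUM its next cheque number column, but verify the table and column names against your own GP install before relying on them:

```sql
-- Inspect the next cheque number held against each chequebook.
SELECT CHEKBKID, NXTCHNUM
FROM CM00100;
```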

hmmm…

It felt like something was wrong when GP was getting the next chequebook number (or perhaps payment transaction number, though it felt less likely), you get a gut feeling on these things when working with a product long enough!

The next action was to capture a SQL debug trace and a Dexterity script debug trace around that operation. The SQL trace had nothing interesting in it; however, the script trace did. It showed the error box was being called after the function “GenerateNextEFTNumber” was executed.

'GenerateNextEFTNumber of form PM_Print_Computer_Checks', table 'CM_Checkbook_MSTR', "0000000000000001012343295", 0

           'PM_Number_Inc_Dec', "", 1

'SaveCurrentFormTrigger', 2567, 0, "SY_Error_Message"

EFT Number, what is that?? – I’m not that familiar with the EFT module in GP, so I went looking around the configuration for EFT, which hangs off the chequebook form.

I found in the EFT options here: Cards->Financial->Chequebook->EFT Bank->Payables Options

EFT Setup - Payables Options showing Use Cheque Numbers and an empty field for the next number


In there, the option is set to use the cheque number, but it looks like it is possible, rather than use up cheque numbers, to use a unique EFT sequence number for EFT payments. This was the moment I felt I knew what was happening. You may see from the screenshot that there is no value in the Next EFT Payment Number field. On the test environment I seeded this value with EFT000000000001, having also set the option to “Use EFT Numbers”. This time the payment processed as expected and the Next EFT Payment Number incremented by the number of payments.

Aha! – I set the option back to “Use Cheque Numbers” but left the Next EFT Payment Number populated, and it still allowed payments to be processed. It looks like the blank field was the problem.

Fix Production environment

I moved to the production environment and ran:

UPDATE CM00101 SET EFTPMNextNumber = 'EFT000000000001'

This will seed the Next EFT Payment Number for all the chequebooks (make sure this is OK in your environment, and run it against each company database).
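A slightly more cautious variant of that statement (a sketch, not from the original fix – the WHERE clause is my addition) only seeds chequebooks where the field is actually blank, leaving any chequebooks that already have an EFT sequence untouched:

```sql
-- Seed the Next EFT Payment Number only where it is currently blank.
-- Run against each company database; back up first.
UPDATE CM00101
SET    EFTPMNextNumber = 'EFT000000000001'
WHERE  RTRIM(EFTPMNextNumber) = '';
```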


The setup options for EFT now looked like this below, with a value in the next number field but the option still set to use cheque numbers…

EFT Options with corrected setup

On attempting the payment run, it worked just as it did pre-upgrade – yay!


Conclusion

The conclusion is that even if the option is set to “Use Cheque Numbers”, GP will still attempt to generate EFT numbers, and thus needs the Next EFT Payment Number field to be populated. It is not so much “used” as “required”.

Also, out of curiosity, I created a new chequebook. On opening the EFT options window for this new chequebook, the Next EFT Payment Number field is pre-populated by the application with “EFT000000000001” – thus on current versions of GP this issue would not happen, as there would always be a value in that field. I guess in past versions of GP this field was not auto-populated by the application, so our chequebooks ended up with blank fields that only became an issue after upgrading to this latest version of GP.

It was a stressful few hours, but the outcome was good.

Dynamics GP – The document is in use and could not be attached, when attaching a document



When using document attach in Dynamics GP, from GP 2016 onwards, it is not possible to attach a document if it is open in another application.

If a user has a document open and then attempts to attach it, the error is:

The document is in use and could not be attached.

DocAttachError

I think this is a positive change, although it can require users to change working practices, just a little, to accommodate it.

I have the feeling this is to prevent data loss. If a user attaches a document, such as a Word or Excel file, that is both open and has unsaved changes, they may mistakenly believe they have safely stored the document in the document attachment. Unfortunately, they may not realise that the outstanding changes in the open document will not be in the version they captured to GP from the file system. Those changes could get lost quite easily (through machine restarts, or the user saving to the file system without refreshing the attached version).

Whilst protecting users from themselves, it does pose the problem that if a user has the preview pane of File Explorer open, or a PDF viewer open looking at a PDF, then the file will be locked on the file system, preventing them from attaching it. For end users this can seem odd and frustrating behaviour, although it is obvious to an IT professional why this is so. With a little change to workflows and practices (user training), this can be overcome.

A common example seems to be AP departments scanning and attaching AP invoice PDFs. The document scanner opens the PDF automatically in a PDF viewer immediately after scanning, blocking its ability to be attached in GP. Users just need to learn to close documents and turn off previewers to work effectively again.

Another user is editing this item – Dynamics GP error


Another User Is Editing This Item

The message “You can’t delete an item because another user is editing this item” pops up.

This is due to a lock in the SY_ResourceActivity table (DYNAMICS..SY00801).


Check the table for the item number in the RSRCID field; you can list all the records locked in the system with the following SQL statement…

SELECT * FROM DYNAMICS..SY00801


DYNAMICS_SY00801

To resolve, remove the rows that are for the item in question.

DELETE FROM DYNAMICS..SY00801 where rsrcid='{insert item number}'

You should of course check that the user is not actually editing the item. In this support case, the user themselves could not delete the item, due to a system crash during a previous edit.
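Before deleting lock rows, it can help to cross-check who is actually logged in to GP. The following is a hedged sketch; DYNAMICS..ACTIVITY is the standard GP user activity table, but verify the table and column names against your own install:

```sql
-- List current GP user sessions, to confirm the locking user
-- really is logged out before removing their rows from SY00801.
SELECT USERID, CMPNYNAM, LOGINDAT, LOGINTIM
FROM DYNAMICS..ACTIVITY;
```

If the user holding the lock does not appear here, the lock row in SY00801 is likely stale (for example, left behind by a crash) and can be removed.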

An Unknown Error Occurred sending mail in Dynamics GP with Office 365


Attempting to send email from Dynamics GP using multi-factor authentication (MFA) results in an error after selecting a user in the Office authentication window.

An Unknown Error Occurred

An Unknown Error Occurred - Dynamics GP Email

There is no explanation as to what the error actually was. In the case I was investigating, some users could email and yet others could not. This was strange, as the machines they were using originated from the same base image.

 

After logging was turned on, it turned out the underlying error was that the following file does not exist:

c:\Users\{userID}\AppData\Roaming\Microsoft Business Solutions\Microsoft Dynamics GP\Microsoft.Dynamics.GP.BusinessIntelligence.TemplateProcessing.dll.msalcache.bin3

 

GP looks to be using the Microsoft Authentication Library (MSAL) to cache the Azure authentication tokens for the connection to Office 365 in this folder and file. However, if the folder does not exist, then the library cannot write the cache file. It looks like this folder will only be created if the user in GP has had the field AutoComplete feature turned on, or has had it in use at some point in the past.

Details of Auto complete functionality is found here: Using AutoComplete functionality with Microsoft Dynamics GP

This feature also uses this folder to store the database files that drive the auto complete functionality.

It is controlled from the User Preference window.

Autocomplete setup window in GP

Comparing, using SQL, a user that can send email against one that can’t:

Settings

We can see that one has AutoComplete on and the other has it off; checking for the existence of the folder also shows that the user encountering the error does not have the folder.

So after creating the following folder,

c:\Users\{userID}\AppData\Roaming\Microsoft Business Solutions\Microsoft Dynamics GP\

...it was found that email functionality worked correctly. I assume the developer of the email feature assumed that this folder always exists on all systems, unaware that it may not be created if AutoComplete is turned off. My guess is that the code does not create the folder when it does not exist, hence the error.

Solution:

The creation of this directory has been added into group policy for GP users.

Hope this helps someone else, comment if you found it helpful.

Dynamics GP Addins and Visual Studio 2022 error adding new form


“The designer could not be shown for this file because none of the classes within it can be designed. The designer inspected the following classes in the file: DynamicsGPForm1—The base class ‘Microsoft.Dexterity.Shell.DexUIForm’ could not be loaded. Ensure the assembly has been referenced and that all projects have been built.”

When using the visual studio add in template extension for Dynamics GP with Visual Studio 2022, you may experience the above error. This is noted in the following community posts: 

Building an application with VS 2022 fails when attempting to add Dynamics GP form.

Microsoft.Dexterity.Shell.DexUIForm

 

 

There was a major change with the introduction of Visual Studio 2022, a departure from previous versions: Visual Studio went fully 64-bit in this release. This means that when a user is working inside Visual Studio, everything in the IDE runs as a 64-bit application, a push towards the general adoption of 64-bit as the new normal. This is all great, except that Dynamics GP is a 32-bit application and is not likely to become 64-bit any time soon.

When developing an Add-In for Dynamics GP, the various Dynamics GP libraries that are referenced are 32-bit libraries. When a GP form is added to the solution, the above error occurs, because the GP form would normally be run by Visual Studio at design time so that controls can be moved and dropped onto the form visually (the visual designer). However, Visual Studio now runs as 64-bit, and it cannot load 32-bit code; hence the error is raised by the designer.

This issue does not just affect GP developers; developers wishing to consume any 32-bit library or ActiveX/COM/OCX component get the same problem. There were a number of "bug" reports around this after the launch of Visual Studio 2022, and these have now been consolidated into the following report. To be fair to Microsoft, this is not really a bug, as it is the intended behaviour: WinForms .NET Framework Projects can't display the designer for 32bit references. I would highly recommend reading this thread to get an understanding of the change, and to see the desperation of developers on other products for a solution to this issue.

The "workaround"

In the thread WinForms .NET Framework Projects can't display the designer for 32bit references, the workaround is given. It works for me: create a separate build configuration for the solution, with each project targeting Any CPU, then clean and rebuild. This allows the designer to work. Then, before building the DLL to deploy to GP, change the project target back to 32-bit, after any form design work is complete (close any forms that are open). There is also the suggestion to simply not use Visual Studio 2022 for 32-bit WinForms design and instead keep using Visual Studio 2019. However, 2019 tries to push you onto 2022 with messages like ‘Please update to Visual Studio 2022, we are not supporting Visual Studio 2019 any longer.’, so who knows how long this will remain practical as a solution.
There was also talk on the bug thread about using the Visual Studio option Options >> Environment >> Preview Features >> “Use the preview Windows Forms out-of-process designer for .NET apps”. I have had no success selecting this option, and don’t think it is applicable to Dynamics GP development.

Using Build >> Configuration Manager in Visual Studio, use the <New..> option to create a new configuration for x86 and Any CPU, copying settings from Any CPU or x86. You should end up with two options under both the "Debug" and "Release" solution configurations: one x86 and the other Any CPU, as shown below.

Select Any CPU and clean/build to do any design work; then, before deployment or debugging with GP itself, it must be switched back to x86 so that a 32-bit application is compiled.

Below are the relevant parts of the referenced thread, in case it should vanish from the internet...

Visual Studio 2022 and WinForms 32-bit .NET Framework components

Visual Studio 2022 is the first Visual Studio that runs as a 64-bit application. We received lots of requests to make this change and now you can open, edit, run, and debug the biggest and most complex solutions without running out of memory. This switch required fundamental architectural changes and there were some scenarios related to .NET Framework 32-bit components that we could not make work in Windows Forms designer.

In order to support .NET Core 3.1 and above we had to create an out of process designer. For .NET Framework, the designer remained in-process. The reason for that decision was to minimize breaking changes for 3rd-party controls that interact with Visual Studio. Since Visual Studio is now 64-bit, the .NET Framework-based designer now also runs in 64-bit, which means 32-bit code cannot be loaded at design time any longer. At the same time, to load controls for .NET Core, we created a new designer that runs out of process. Since that process can run as either a 32-bit or 64-bit process, it can be used with 32-bit controls. So, for .NET 6, .NET 5, and .NET Core applications you have full designer support. But if you have a .NET Framework application that is referencing a 32-bit component, this component cannot be loaded in Visual Studio 2022 Windows Forms designer, and you will see a white error screen instead. This will not affect your end users, as your application will run just fine, it only affects the designer experience inside of Visual Studio.

In the future, we plan to add support for .NET Framework projects to the out of process designer. Once that support is available, it will be possible to load forms that have 32-bit specific dependencies in the designer.

Meanwhile there are a few ways to make your Forms show in the designer:

  • You can simply keep using Visual Studio 2019 for forms that have references to 32-bit components until we add support for .NET Framework projects to the out of process designer.
  • If you are an author of the component you are using, rebuild it for ‘x64’ for native components or ‘Any CPU’ for managed code. For managed code, the procedure should be very straight forward.
  • If a 32-bit component cannot be rebuilt, you can create a “design” configuration of the project, which targets 64-bits and where the 32-bit component is replaced with a 64-bit placeholder. This particular component will not be available at design time but the rest of the UI will be.
  • If you are using a third-party component, reach out to the author and ask them to provide an ‘x64’ or ‘Any CPU’-version of it.
  • For some cases, such as Visual Basic 6, producing 64-bit is not an option because they are legacy controls that don’t have an implementation on 64-bit CPU. We would suggest finding a .NET substitute for the component that provides similar functionality.

Then...

We hear and understand your concerns regarding the 32-bit references. Due to the way the Framework WinForms Designer was created, the form you are designing runs in the same process as Visual Studio runs in. Unfortunately, with the move to 64bit it meant that references now must be of an architecture that x64 can work with (AnyCPU or 64-bit). If you have access to the source code of the original projects the key is to design them with the reference as AnyCPU, even if you ultimately build and release a 32bit version of the reference.

The only way for us to fix this in WinForms is a rearchitecting of the designer entirely to run out of the process of Visual Studio - and if it were to do that, you would be able to design with your references in whatever architecture suited your needs. Unfortunately moving the designer out of process is far more than a bug fix; it’s an entire re-architecting of how the design surface in one process communicates with the Visual Studio process and passes the information back and forth about control properties. We have been working on creating an out-of-process WinForms Designer for .NET applications, and we’re getting pretty happy with the user experience in this last VS release. That said, there is still more work to be done to support .NET Framework projects and it doesn’t preclude the need to do something with the references that are causing you trouble now. They may need to be built against this new architecture for full support in the out of process designer.

We’re working hard on investigating what it will take to move the Framework designer out of process like the .NET WinForms Designer, and I will circle back on this thread with more information as I have it. I’m unable to provide a timeline yet, but once we’ve completed our investigations and done some prototype work we should have a better idea about the timeline of the project. Stay tuned for a WinForms blog post early in the new year which will give you a little insight into how we got an out of process designer to work, and future blog posts about the details of the new architecture.

 

 

