Dynamic Code Blocks

Microsoft Dynamics GP – United Kingdom, Making Tax Digital (VAT) for business MTD / MTDfb


Making Tax Digital is Her Majesty's Revenue and Customs (HMRC) introducing mandatory electronic filing of VAT (tax) returns by businesses, using an application that supports their application programming interface (API). VAT records must be maintained and submitted digitally after April 2019. HMRC are encouraging software developers to engage with them to create applications that interoperate with their APIs: HMRC Blog - Digital Relationship Management – A new era in 3rd party software collaboration. Xero and other software companies have been working on the private beta of the APIs for some time, but what about Microsoft?

HMRC - Overview of Making Tax Digital

 

Dynamics GP, SMB products and Making Tax Digital VAT submission MTD

Microsoft Dynamics products AX, NAV and 365 Finance & Operations will all enable customers to be compliant; the details of how are yet to become clear – perhaps bridging software?

Microsoft Dynamics community- Are you using our accounting software Dynamics NAV or Dynamics 365 Business Central and worried about compliance with Making Tax Digital?

Did you know the Microsoft SMB ERP products also include Dynamics GP? Dynamics GP already allows VAT to be recorded electronically and even has the VAT daybook functionality. You would think it not too great a leap to create a little helper application that could pluck the records from the database and submit them via the HMRC web API. It is disappointing that in April 2018 all we have from Microsoft is this page for users of GP:

VAT 2019 digital regulations for Microsoft Dynamics GP

that points to the suggestion list for new features for GP…

Suggestion: GP needs VAT electronic filing to be compliant with Making Tax Digital initiative

One may conclude from this that there are no plans in action at the moment for it to be supported; on the other hand, you could take the view that this is Microsoft acknowledging the requirement – perhaps Terry has had her ear bent by the UK Dynamics partners, resulting in this page? Admittedly the deadline is a year away, and the Microsoft developers can work quite quickly when they need to. To be fair, the product suggestion site is the funnel through which new features and requirements should be captured, so perhaps it is in hand?

However, it would be nice to know soon whether UK customers need to find third-party bridging software to the HMRC APIs, so we can start answering our Finance Directors' and Controllers' questions about what the plan is for 2019 (and yes, that question sparked this blog post).
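To give a flavour of what such a helper application would need to do, here is a minimal sketch of submitting the nine-box VAT return to HMRC's MTD VAT API. The endpoint shape follows HMRC's published developer documentation, but the token, VRN and figures are placeholders, and a real client must also implement the OAuth 2.0 grant and HMRC's fraud-prevention headers:

# Minimal sketch only - submit a nine-box VAT return via the HMRC MTD VAT API.
# $accessToken and $vrn are placeholders; a production client must obtain an
# OAuth 2.0 user-restricted token and send the fraud-prevention headers.
$accessToken = "<oauth-access-token>"
$vrn = "123456789"   # VAT registration number

$body = @{
    periodKey                    = "18A1"
    vatDueSales                  = 1000.00
    vatDueAcquisitions           = 0.00
    totalVatDue                  = 1000.00
    vatReclaimedCurrPeriod       = 250.00
    netVatDue                    = 750.00
    totalValueSalesExVAT         = 5000
    totalValuePurchasesExVAT     = 1250
    totalValueGoodsSuppliedExVAT = 0
    totalAcquisitionsExVAT       = 0
    finalised                    = $true
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "https://api.service.hmrc.gov.uk/organisations/vat/$vrn/returns" `
    -Headers @{ Authorization = "Bearer $accessToken"; Accept = "application/vnd.hmrc.1.0+json" } `
    -ContentType "application/json" `
    -Body $body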

 

Spreadsheets

What is amusing is that HMRC have announced that spreadsheets are acceptable when used with bridging software to the APIs – heaven knows what accountants would do without them!

“Businesses will be able to continue to use spreadsheets for record keeping, but they must ensure that their spreadsheet meets the necessary requirements of Making Tax Digital for Businesses. This is likely to involve combining the spreadsheet with software.”

AccountingWEB – Spreadsheets for Making Tax Digital – The universal language of accounting

 

I will update this post if there are further developments over the year.


.NET 4.6.1 or 4.6.2 seems to break IsInRole()


An upgraded application using IsInRole() now only returns False (VB.NET)

Read to the end before making code changes as there is a more obvious thing to check!
 
To support TLS 1.2 for PCI requirements, I was upgrading one of our applications to .NET 4.6.1. After deployment, behaviour controlled by our Active Directory groups was broken; it was as if no one was a member of any AD groups anymore. At first I thought it must be a coincidental screw-up by someone in AD. It turns out it was something else…
 
The following code is used to check against a list of security groups to see if the current user belongs to any of them.
 
Imports System.Security
Imports System.Threading

' Checks whether the current Windows user belongs to any of the
' supplied AD security groups (group names without the domain prefix).
Public Shared Function IsInAdSecurityRole(RoleName() As String) As Boolean
    Dim aName As String = Principal.WindowsIdentity.GetCurrent.Name
    Dim aDomain As String = aName.Substring(0, aName.IndexOf("\") + 1)
    AppDomain.CurrentDomain.SetPrincipalPolicy(
        Principal.PrincipalPolicy.WindowsPrincipal)
    For index = 0 To RoleName.Count - 1
        If Thread.CurrentPrincipal.IsInRole(aDomain & RoleName(index)) Then
            Return True
        End If
    Next
    Return False
End Function

The code is ancient and has been in our applications for a very long time, but on upgrading to .NET Framework 4.6.1 it returns False for all roles. I checked casing and inspected it in the debugger, yet failed to see why it had stopped behaving as it always had before.
 
Unable to figure out what had happened, and needing to get systems running again, I imported the namespace System.Security.Principal and used the following method instead; all then seemed well again.
 
Imports System.Security.Principal

Public Shared Function IsInAdSecurityRole(RoleName() As String) As Boolean
    Dim currPrincipal As New WindowsPrincipal(New WindowsIdentity(Environment.UserName))
    For index = 0 To RoleName.Count - 1
        If currPrincipal.IsInRole(RoleName(index)) Then
            Return True
        End If
    Next
    Return False
End Function

I used this reference:

My.User.IsInRole() is not working after migrating to 4.6.2 framework in vb.net

 

Authentication mode in project settings: Application-defined vs Windows

VB.NET has a project setting that says whether you wish to use an application-defined authentication method or the default Windows one. This was something I had totally forgotten existed. It looks like the Authentication mode of the project got changed during the migration. Check the project properties, under Authentication mode, and see if changing it from Application-defined to Windows helps; it did in my case, bringing behaviour back to what is expected.

 


Change the drop down combo box to “Windows”


 

Reference: https://social.msdn.microsoft.com/Forums/en-US/d00b65dd-61d8-4368-b2d2-eaedfc66af40/myusername-is-now-returning-empty-string?forum=vbgeneral

Dynamics GP Font Size


Increasing font size for GP users with poor eyesight

Increase GP font sizes

Today one of my co-workers from the infrastructure team asked me to include a “magic file” in our standard GP deployment. The file is activated via a registry key on the user’s machine and gives them a bigger-font experience when using Dynamics GP in conjunction with font scaling in the Windows operating system.

Font scale in Windows

Font scaling in Windows is found by typing “scaling” into the start menu…

 

Make everything bigger in Windows 10

Then you can scale up the system fonts…

Font scale in windows 10

 

Font Scale in GP

To make this sit well with Dynamics GP you need to apply this technique.

The required file is named “Dynamics.exe.manifest” and is dropped into the program files directory side by side with the application.  The contents of the manifest file are below:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0" xmlns:asmv3="urn:schemas-microsoft-com:asm.v3">
  <dependency>
    <dependentAssembly>
      <assemblyIdentity type="win32" name="Microsoft.Windows.Common-Controls" version="6.0.0.0" processorArchitecture="*" publicKeyToken="6595b64144ccf1df" language="*">
      </assemblyIdentity>
    </dependentAssembly>
  </dependency>
  <dependency>
    <dependentAssembly>
      <assemblyIdentity type="win32" name="Microsoft.VC90.CRT" version="9.0.21022.8" processorArchitecture="amd64" publicKeyToken="1fc8b3b9a1e18e3b">
      </assemblyIdentity>
    </dependentAssembly>
  </dependency>
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
  <asmv3:application>
    <asmv3:windowsSettings xmlns="http://schemas.microsoft.com/SMI/2005/WindowsSettings">
      <ms_windowsSettings:dpiAware xmlns:ms_windowsSettings="http://schemas.microsoft.com/SMI/2005/WindowsSettings">false</ms_windowsSettings:dpiAware>
    </asmv3:windowsSettings>
  </asmv3:application>
</assembly>

 

The manifest file is “activated” and the bigger font experience is enabled using a registry key change on the user machine:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide] "PreferExternalManifest"=dword:00000001

 

Thus the GP fonts are scaled up for the users that are using font scaling on the operating system.
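If you are rolling this out to many machines, both steps script easily. A minimal sketch, assuming a default GP 2018 install path (adjust for your environment):

# Sketch: copy the manifest beside Dynamics.exe and enable external manifests.
# The install path below is an assumption - point it at your GP directory.
$gpPath = "C:\Program Files (x86)\Microsoft Dynamics\GP2018"
Copy-Item ".\Dynamics.exe.manifest" -Destination $gpPath

New-ItemProperty -Path "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\SideBySide" `
    -Name "PreferExternalManifest" -PropertyType DWord -Value 1 -Force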

 

It was also brought to my attention that newer builds of Windows 10 have a whole subsystem attempting to grapple with DPI scaling issues. I would assume this is a UI for the same system that the manifest file is activating?

Right click on an executable and go down the rabbit hole…

Windows 10 Advanced DPI compatibility menu

 

 

Reference:

After being shown this, I realised I’d seen this “hack” before and remembered it was from Steve Endow’s blog:

Fix Dynamics GP scaling and font size issues on high DPI displays

- where in this case he was trying to solve a slightly different problem with high-DPI displays. So thanks to Steve, our users are benefitting every day from a more comfortable ERP experience.

Windows scaling issues for high-DPI devices

 

 

If you found this helpful do comment, it motivates me to blog more!

Converting Dynamics GP SOP Master numbers to file system folder hierarchy


There are a number of cases where I need to store a document or documents against an “object” in GP for the various system integrations and modifications. This can be problematic, even more so with the latest update to Windows 10, which seemed to seriously detrimentally affect the performance of folders with large numbers of files or subfolders in them. Take the SOP document in GP. SOP documents are threaded together via a relationship called the master number. This master number relates all documents together between quote, order, invoice and return.

It is appropriate in some cases to store files against those master numbers. This could be done in a flat structure where a documents folder contains a subfolder for each master number, and inside those folders lie the documents we wish to preserve. This starts out great; however, by the time you end up with 500,000 sales master numbers and documents, opening the network store folder starts taking three to four minutes. Although anything over a few thousand files or subfolders in a folder has always incurred a performance hit, with the new update this seemed to get a whole lot worse (it felt like a caching issue, as subsequently re-opening the folder happened instantaneously).

Storing all those folders in a single folder is not a good user experience either; it is overwhelming to try and find a folder of interest among them all. A much better way is to store the folders in a hierarchy so there are never more than 1,000 folders in each subfolder. This keeps the file system happy and quick.

 

So how to migrate the flat structure to hierarchy?

 

I love PowerShell and wish I could script in it more fluently than I can. PowerShell seems to have the power to address any mundane task to do with Windows servers and clients. Using PowerShell it is possible to move the flat structure into a new root directory, building it up into a hierarchy. My preference is for each folder to intuitively orientate the user as to where they are, without having to digest too much of the path information.

There are lots of options for a directory arrangement scheme, but I like to do as follows, for a master number of 13224223, the folders can be organised like this:

\\SalesArchive\13220000\13224000\13224200\13224223\

This then necessitates a way to transform the number 13224223 into this directory arrangement. This is something regular expressions can help with; I’ve been using this technique for many, many years now.

Regular expression:

([0-9a-zA-Z]*)([0-9a-zA-Z]{1})([0-9a-zA-Z]{1})([0-9a-zA-Z]{2})$

Replacement expression:

${1}0000\${1}${2}000\${1}${2}${3}00
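A quick way to sanity-check the transform is to run the example master number through PowerShell’s -replace operator (the folder name itself is appended later, when the folder is moved):

# 13224223 splits into groups 1322 | 4 | 2 | 23, giving the three parent levels:
'13224223' -replace '([0-9a-zA-Z]*)([0-9a-zA-Z]{1})([0-9a-zA-Z]{1})([0-9a-zA-Z]{2})$',
    '${1}0000\${1}${2}000\${1}${2}${3}00'
# Output: 13220000\13224000\13224200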

So, integrating this into our PowerShell with an optional date limit, we get the following…

$sourcePath = "\\SalesArchive"
$destPath = "\\SalesArchive\New"
Write-Debug $destPath

# Take each folder in the source root (optionally only those last written
# before a cut-off date), excluding the destination folders themselves.
Get-ChildItem "$sourcePath\*" |
    Where-Object { $_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(0) } |
    ForEach-Object {
        if ($_.Name -ne "New" -and $_.Name -ne "New2") {
            # Transform the master number into its hierarchy path.
            $newFolder = ($destPath + "\" + ($_.Name -replace '([0-9a-zA-Z]*)([0-9a-zA-Z]{1})([0-9a-zA-Z]{1})([0-9a-zA-Z]{2})$', '${1}0000\${1}${2}000\${1}${2}${3}00'))

            New-Item -Path $newFolder -Type Directory -Force
            Move-Item $_ $newFolder
            Write-Host "source: $sourcePath dest: $newFolder"
        }
    }

This will move the folders into the new shape for us.

In .NET we can also use the same regular expression to open the folder or fetch/save files, using the .NET regular expression library, something like this:


Return RootFolder & System.Text.RegularExpressions.Regex.Replace(DocumentMasterNumber,
"([0-9a-zA-Z]*)([0-9a-zA-Z]{1})([0-9a-zA-Z]{1})([0-9a-zA-Z]{2})$",
"${1}0000\${1}${2}000\${1}${2}${3}00")

Working things this way lets users access the folders instantly again and makes manual navigation of the folder structure possible, should it be needed.

Printing to multiple UPS label printers (from Dynamics GP)


This is a quick note to self:

When an order is fulfilled it is routed electronically to the appropriate carrier, depending on the nature of the consignment.

The various carriers we use then print the thermal shipping label directly at the warehouse packing bench that is fulfilling that order. With one central thermal label printer and many benches, the process is both ergonomically inefficient and too error prone (wrong labels get on wrong parcels).

 

UPS WorldShip

UPS WorldShip allows printers to be defined and named within the WorldShip software. The printer can be a network printer or a printer shared from another machine.

Each warehouse packing bench has a shared (or network) printer placed on it. Each of these printers is set up in WorldShip with the bench name (BENCH1, 2, 3…).

The GP fulfilment process then sends the printer name for the bench the machine is on, within the XML for the despatch. The WorldShip auto XML import picks up that printer name, ensuring that when the shipment is processed, the labels are routed to the thermal printer on the originating warehouse packing bench.
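As a sketch of the idea only (the element names and import path here are illustrative assumptions; consult the WorldShip XML Auto Import documentation for the actual schema), the fulfilment process drops something like this into the import folder:

# Illustrative only: stamp the bench's WorldShip printer name into the
# despatch XML. Element names and the import path are assumptions.
$bench = "BENCH1"   # derived from the fulfilling machine in the real integration
$xml = @"
<OpenShipments>
  <OpenShipment>
    <!-- ...consignee, service and package details... -->
    <LabelPrinter>$bench</LabelPrinter>
  </OpenShipment>
</OpenShipments>
"@
Set-Content -Path "\\worldship\AutoImport\despatch-001.xml" -Value $xml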

Drivers

UPS use a bespoke driver for the Zebra thermal printers. In fact they even have their own badged version of the Zebra printers with UPS firmware on. This lets them get the fonts and layout of their labels correct, so if using a non-UPS Zebra thermal printer you must install the UPS drivers on the machine, go into the printer settings and replace the stock driver with the UPS one. This works for printers from the old DA402 up to the more modern GK series.

If you are putting your own logo on the UPS labels by using the custom label templates within WorldShip, and thus are using the extended-size label stock, you must also ensure that you use the Zebra printer media calibration, found in the printer settings tools section, to calibrate the printer stock size.

 

Thus we can now fulfil orders on many warehouse packing benches with one WorldShip instance, and thermal labels will print on the correct bench’s thermal printer.

Dynamics GP 2018 – wrong version of GP 2018 PowerShell included


Be aware of an issue with the GP2018 installation media

GP2018 PowerShell Additional Product is version 2016

You can see above that the installation media MDGP2018_RTM_DVD_ENUS has, under Additional Products, the GP2016 installer for PowerShell.

The correct installer has been provided by Microsoft under Customer Source here: https://mbs.microsoft.com/customersource/northamerica/GP/downloads/service-packs/mdgp2018_dexterityreleases

It is not obvious that an administration module should be under something as specific as Dexterity Development System – but hey, at least it IS available!


Dynamics GP 2018 Power Shell installer found under Dexterity Development System Releases for Dynamics GP 2018

Installing Docker onto Windows Server 2016



https://docs.docker.com/install/windows/docker-ee/

Docker comes in two editions: the free Community Edition and the Enterprise Edition.

The company Docker and Microsoft entered into a commercial agreement to bring Docker to Windows Server as a commercially supported enterprise container product.

Docker running containers on Windows is the result of a two-year collaboration between Microsoft that involved the Windows kernel growing containerization primitives, Docker and Microsoft collaborating on porting the Docker Engine and CLI to Windows to take advantage of those new primitives and Docker adding multi-arch image support to Docker Hub. (https://blog.docker.com/2016/09/dockerforws2016/)



As a result of the agreement, Docker Enterprise is already licenced as part of Windows Server 2016, so you just need to install and use it!


There are two types of containers: Linux and Windows. To run Windows OS containers, you must install the Windows provider. Choose the appropriate container type below and issue the relevant commands.

To install for Windows containers:

Install-Module DockerMsftProvider -Force

Install-Package Docker -ProviderName DockerMsftProvider -Force

To install for Linux containers:

Install-Module DockerProvider -Force

Install-Package Docker -ProviderName DockerProvider -Force


Once installed, there will be a Docker service on the machine:

Docker Service

When the command has finished executing (it can take some time), PowerShell should be aware of Docker commands. Thus issuing the following command:

Docker version

will return information about the Docker version.

Docker Version
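A quick smoke test at this point is worthwhile; something along these lines (the hello-world image tag is an example – pick one compatible with your host build):

Get-Service docker                        # confirm the Docker engine service is running
docker info                               # engine details: storage driver, isolation, OS/arch
docker run --rm hello-world:nanoserver    # pull and run a tiny test container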


Uninstall Docker from Windows Server 2016

Uninstall-Module dockerprovider

Uninstall-package docker

or for Windows containers:

Uninstall-Module DockerMsftProvider

Uninstall-package docker

Docker build and UtilityVM: The parameter is incorrect – failed to register error


The following error can occur when doing a Docker build.


docker : failed to register layer: re-exec error: exit status 1: output: ProcessUtilityVMImage
\\?\C:\ProgramData\docker\windowsfilter\UtilityVM: The parameter is incorrect.
At line:2 char:1
+ docker build --tag tw/gp2018 .
+ CategoryInfo          : NotSpecified: (failed to regis...r is incorrect.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError

 

On a Windows build, this is often due to the base image not being compatible with the operating system of the Docker host machine.

For example, I moved my dockerfile and build directory from Windows 10 to Windows Server 2016 and experienced this error.

It turns out that the windowsservercore image I was attempting to use was not compatible.

See here https://docs.microsoft.com/en-us/virtualization/windowscontainers/deploy-containers/version-compatibility for the guide on which Windows container versions are compatible with different hosts.

(Windows container version compatibility table)

I was attempting to use microsoft/windowsservercore:1803, but the only version compatible on Server 2016, from the above table, without using Hyper-V isolation (where a lightweight utility VM hosts the container), is build 14393.

Thus in my dockerfile, changing the base image instruction to

FROM microsoft/windowsservercore:ltsc2016

solved the above error on the server host.

The reason is obvious really: Docker works by sharing the underlying kernel from the host with the containers, overlaying layers of files until you reach the image. Changes to the host operating system will adversely affect this layering, as expected components may not be present that were there when the image was created. The container may start but fail later.

Using Hyper-V isolation works, as Hyper-V isolation introduces its own kernel instead of the host’s, thus isolating the container from the host operating system. This still gives us benefits if multiple containers are run, and the benefits of portability remain, though it is less efficient on disk storage. Use the switch parameter --isolation=hyperv to enable this isolation.
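For the build in this post, that would have looked like:

# Re-run the failing build under Hyper-V isolation instead of changing the base image:
docker build --isolation=hyperv --tag tw/gp2018 .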


GPUG Summit 2018 Phoenix Arizona–October 15-18th


If you are into Dynamics GP, Power BI or Business Central, then head over to the GPUG, BCUG & Power Summit this October in Arizona – even better, bring your “IT crowd” too!

 

GPUG Summit Site Phoenix, October 15-18, 2018

This event has over 215 GP-specific sessions, with separate parallel conferences going on “under the same roof” for Business Central and Power BI. There will be lots of ISVs and networking opportunities in the expo hall and corridors.

Session listing can be found here for the three conferences:

(Power BI) PowerBIUG Power Summit

(Business Central) BCUG Summit

(Dynamics GP) GPUG Summit

 

It is difficult to emphasise enough the value to be found in attending this kind of event, with a world-class concentration of expertise in the Microsoft Dynamics products housed in the same place for a week. There is a strong sense of community and endless opportunities to get answers to long-held questions, or to discover products and features you never realised existed. This knowledge can be priceless when brought back to the workplace after the event.

Lots of practical sessions are available on how to better utilise the investment you have in your company systems. This includes discovering the tools you didn’t even realise you already had at your fingertips. Notably, this year there is real growth in the developer and IT pro tracks – so not only will the accounting, purchasing, warehousing, sales and admin employees find value in attending, now the IT crowd can come along and there will be lots for them to get their teeth into too.

Have you ever considered how much more could be achieved in your company by growing a better understanding of the financial systems that drive the business, by getting buy-in and understanding from your IT support staff and developers? Trust me, once they see the kind of things that are possible from the exhibiting companies and from the technical sessions, they will return enthused and ready to help facilitate change, perhaps bringing in improvements (great and small) for everyone! My message is simple: BRING YOUR IT CROWD TO SUMMIT 2018!

  • Newly released Microsoft Dynamics GP 2018 R2 (October 1st) will be in action on stage and within sessions
  • Meet new peers, reunite with existing friends and mentors within the GPUG community
  • GPUG Medics will be onsite and available all week to talk you through your unique GP challenges
  • BI track: Power BI fans can attend Power Summit sessions at NO additional cost

See my write-up of the 2016 Summit in Tampa for an example of what to expect.

As a bonus I will be presenting two sessions, so see you there!

Are you a Tab or an Enter kind of person?


Tab or Enter?

Navigation from field to field on a form in Microsoft Dynamics GP can be performed using the [Tab] key or, if the option has been set, the [Return] key on the keyboard.

A quick internet search reveals this ability to change field-navigation preference is a standard feature of software such as QuickBooks, Xero and Quicken, and no doubt most other similar products too.


When creating an add-in form for Dynamics GP, don’t forget to support this unless you want user backlash.

Green screen, serial terminals

The first ERP system I used was literally a green-screen application: you didn’t have a mouse; instead, function keys allowed you to quickly jump around the screen’s functionality. Navigation from field to field was achieved by pressing the Enter key repeatedly until you were at the field you wanted to edit; you changed the field content and pressed Enter again. When the Windows operating system came along, the Tab key became the standard key for moving between fields on forms. My theory is that, for those whose heads were still hard-wired to use the Enter key from non-graphical applications, software vendors kept supporting it to ease the transition to applications running on Windows. Perhaps green screens allowed the Tab key back then too; I’m unsure – I, like others, just used Enter!

Whatever the history, in GP we can set this in the user preferences.


I personally have not used Enter as an option since using GP, but curiosity got the better of me…

If I query this preference for all users, to find what the users are actually using, I find:

SELECT COUNT(*)
    ,CASE
        WHEN MDFRDENT = 0
            THEN 'Tab'
        ELSE 'Return'
        END
FROM DYNAMICS..SY01400
GROUP BY MDFRDENT
 
Results:

Count   Setting
122     Return
59      Tab

Looking at the figures, it seems users prefer the Return key. This might not be the whole picture: many users may not even realise they can change this preference, and the user default in our case is set to Return. Tab is the Windows default for navigation, so are we setting our users up for digital failure in their lives outside work, where they find they can’t use the Enter key? For example, on the internet, in browsers, the Enter key will normally submit a form.

Behaviour

There is data saying that only 10% of regular (non-geek) users tab between form fields; the sad truth is the rest leave the keyboard, grab the mouse, click, then return to the keyboard. [1] Even worse, studies show that users arriving at a form, even though the cursor is blinking in the first field they want to fill, will still grab the mouse and set focus on that already-focused field before typing – ugh!

It seems the standard user just doesn’t value speed or ease in their interaction with the device they spend most of their day in front of. Most of us know users will rarely invest time up front to learn how to use an operating system or application properly, even when it could vastly improve their efficiency; they just don’t value it.

The keyboard power user

While we are talking about this kind of thing, there is another kind of user worth mentioning: rare to find but amazing to watch in action. In my experience they tend to be older users, brought up on green-screen technology. These users don’t leave the keyboard; in fact they raise a big sticking objection if the form tab order is not correct or if they can’t do everything with shortcut keys or key chords, without the mouse. These users use the application like they are playing a familiar song on the piano, using muscle memory to enter data at lightning speed, without taking a breath (grabbing the mouse).

Enter vs Return key

After all that, we remind ourselves that the Enter key and the Return key are actually two different keys that once did different things: look at your numeric keypad for Enter and the big button for Return…

Remove or delete KDSRootKey (KDS Root Key)


When creating a GMSA (group managed service account) for Docker, it is easy to run the scripts too many times, leaving yourself with multiple KDS Root Keys. I’m not aware of a PowerShell command to remove them, but this user-interface-based method works to delete the unwanted KDS Root Keys.

To view the KDS keys

Use the PowerShell command:

Get-KdsRootKey

This will list the keys, including the KeyId that will be required next.


 

If you accidentally created more than one key by running the scripts to create a GMSA user multiple times, then you may delete the keys you don’t need. The creation time will give you a clue as to which keys those are.
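A one-liner that makes the extras easy to spot, sorting oldest first and keeping the KeyId needed for the delete step below:

# List root keys oldest-first; the KeyId column matches the keys shown in dssite.msc.
Get-KdsRootKey | Sort-Object CreationTime | Select-Object KeyId, CreationTime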

 

To remove or delete the KDSRootKey

From the Run command, type:

dssite.msc


This will launch Active Directory Sites and Services. Use the View menu to select “Show Services Node”.

show services

The displayed keys should match the KeyIds listed earlier. Click the key you wish to delete; it can then be deleted via right-click > Delete or the Action menu > Delete.

It is possible to view the created-date properties by right-clicking each key, selecting Properties, then selecting the Object tab. However, I prefer the PowerShell method, as it presents a nicely formatted list in one hit; when many keys have been created, this is more efficient.


Microsoft Dynamics Community Blogs Notifications now enabled again


It used to be possible to subscribe and thus receive an email whenever blog content was updated in the community blogs.

Microsoft community forum notification icon on blogs

The ability to have an email sent when content changes in the community blogs has been restored, after it was lost in the big redesign earlier in the year.

I was recommending to Steve Endow (@steveendow) that he should try this feature to be notified of new blog postings, only to find that the feature was no longer available! After bringing this to the attention of the team responsible, they have kindly restored the functionality. So if you want to know when a new post goes on the blogs, use the bell icon at the top of the blog, as shown in the screenshot.

Configure notifications and digest frequency

To configure the email settings, go to your picture at the top right and use the drop-down to get to Settings.

Profile settings

Select the Email tab at the top of the page, then scroll down to the email settings:





For each group you can change the frequency and nature of the mail from that group (blog).

*Unofficial luggage tag - Phoenix Summit 2018


(3D model preview: Summit 2018 luggage tag)


The above model will load into Windows 10 Print 3D or Paint 3D etc.

If you have a 3D printer, feel free to download the STL for the above luggage tag.

Summit2018.stl

Deploy Dynamics GP during development using Visual Studio External Tools Menu


I’ve never written this up before, but it is a useful technique I use to allow deployment of GP add-ins from the current project’s active build directory to the application AddIns folder.

Note that this example PowerShell script will empty the AddIns folder before deploying. This may not be appropriate where other add-ins are present; tweak the Deploy.ps1 script as appropriate for your development environment.


Setup external tool

Create a Deploy directory in the root of the project you wish to enable for copy deployment to the GP addins folder.

Save the following Deploy.ps1 script to that folder:

Param(
    [string]$deployFrom,
    [string]$deployTo
)
Write-Host "Deploy from: $deployFrom To: $deployTo"
# Empty the target AddIns folder, then copy over the current build output.
Remove-Item -Recurse -Force "$deployTo\*"
Get-ChildItem -Path $deployFrom -Recurse | Copy-Item -Destination $deployTo
Write-Host "Deploy complete"

Set up the external tool for launching the PowerShell script:

Deploy GP with External Tools

Command:

C:\windows\system32\windowspowershell\v1.0\powershell.exe


Arguments:

-File "$(ProjectDir)\Deploy\Deploy.ps1" "$(TargetDir)" "C:\Program Files (x86)\Microsoft Dynamics\GP2018\AddIns"

Title:

Deploy GP Add in


Check the “Use Output window” checkbox.


This will add a menu item, “Deploy GP Add in”, under the Visual Studio Tools menu that runs the PowerShell script added to the project. The script deploys from the current build target directory of the current project, so the deployment honours the currently selected build configuration, with the currently selected project as the source.


To assign a keyboard shortcut

From the Visual Studio menu, select Tools > Customize.

Click “Keyboard” button


Use the show commands containing: search box to search for tools.external:


Then assign a shortcut by pressing the key combination while focus is on the “Press shortcut keys:” field.


Problem with extensions search in visual studio


If you are using Dynamics GP with Visual Studio 2017 onwards, you will find that the project templates for Dynamics GP do not show up in the new project or new item menus.

I wrote a solution to this in the form of a Visual Studio extension, but it seems the extension does not show up when searched for in the extension gallery – something brought to my attention via the Microsoft forums.

 

However, if you page through the results and then go back to page one, all of a sudden it appears in the list!

 

I’ll be bringing this to the attention of the VS team; in the meantime, the video shows this workaround.


Diagnosing PowerShell New-GPSystemDatabase error when creating the Dynamics GP system database


The PowerShell modules for GP allow a number of DevOps activities. For example, you may install the Dynamics GP lesson (sample) company TWO and install the system database via these PowerShell modules. I take full advantage of them when running and building Docker images for Dynamics GP.

Today whilst doing this I encountered the following error when moving to build the image for a different version of Dynamics GP:

Dynamics GP - Macro failed to complete successfully

Macro failed to complete successfully.

The PowerShell script works by creating a macro file in the GP\data folder; the script then runs DexUtils, passing the macro file as a command parameter. The macro file tells utilities what to do (in this case, create the system database). You can see this macro in the folder below, captured after this error occurred in the Docker container. CreateSystemDatabase.mac is the macro file that PowerShell is using to create the system database.

CreateSystemDatabase.mac

We have no idea at this point what the error actually is, as the description “Macro failed to complete successfully” is not very helpful. However, you will also notice in that folder the “DexMacro.log” file. Let’s look in that file by reading it with the command “type DexMacro.log”.


So below is the output of that command, the contents of the log.

GP SQL server version issue

There we can see the issue: I’m using SQL Server 2017, which I already know is only compatible with GP 2018 onwards, yet you will notice from the folder name that I’m trying to use it with GP 2016. The error is badly worded – it says you need to upgrade to server version 11 from 14, when it actually means you need to use an older version of SQL Server with this version of Dynamics GP. Something I already knew, but my focus wasn’t in this area at the time I was developing the Docker container.
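As an aside, when the failure happens inside a Docker container, the same log can be read from the host without attaching an interactive session. A sketch (the container name and GP data path are assumptions for illustration):

# Read the Dexterity macro log out of a running container (name/path assumed):
docker exec gp2016 powershell -Command `
    "Get-Content 'C:\Program Files (x86)\Microsoft Dynamics\GP\Data\DexMacro.log' -Tail 20"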


Please comment if you found this helpful – motivates me to blog more!

Protocol handler debugging in Dynamics GP (drill down) problems


In previous blog posts we have talked about the way other applications can drill down into parts of GP using a direct approach, via a pluggable protocol handler that is installed when GP is installed. Sometimes this does not work. To debug whether the protocol handler has an endpoint to talk to, follow these instructions.

 

Get Handle.exe

Download Handle.exe, which is part of the SysInternals tools that Microsoft acquired some time back, written by Mark Russinovich:

https://docs.microsoft.com/en-us/sysinternals/downloads/handle

Extract the zip file; if you have a 64-bit system use the 64-bit version, otherwise use the plain version.

Handle viewer archive extracted

Use PowerShell to pipe out and decode handle.exe output

Open a PowerShell editor on your machine (e.g. start menu, search powershell, launch Windows PowerShell ISE)

In my case I extracted handle.exe into the C:\Users\tw\Documents\ folder; you will need to change the path to wherever you extracted it…

Execute the following PowerShell Command:

C:\Users\tw\Documents\handle64.exe net.pipe -accepteula

This accepts the user agreement and outputs the handle viewer’s results for processes holding net.pipe handles.

This gives the following output.


You can see that Visual Studio and Dynamics.exe have named pipes running.

If I close Dynamics GP, then run the command again…


You see that Dynamics.exe is no longer listed as having named pipes available, as it is closed.

The net.pipe endpoint names shown in the handle output are base64 encoded; decoding one with PowerShell reveals the address being listened on:

[System.Text.Encoding]::UTF8.GetString([Convert]::FromBase64String("bmV0LnBpcGU6Ly8rLw=="))

net.pipe://+/

 

Using PowerShell to list all named pipes:

[System.IO.Directory]::GetFiles("\\.\\pipe\\")
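Since WCF pipe names are GUID-like rather than readable addresses, a practical trick is to diff that list before and after starting GP:

# Snapshot the pipe list, start Dynamics GP, then compare to see its new pipes:
$before = [System.IO.Directory]::GetFiles("\\.\pipe\")
# ...launch Dynamics GP...
$after = [System.IO.Directory]::GetFiles("\\.\pipe\")
Compare-Object $before $after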

Prevent SQL job running when another job is running

CREATE VIEW myschema.Admin_RunningSQLJobs
AS
SELECT job.NAME
    ,job.job_id
    ,job.originating_server
    ,activity.run_requested_date
    ,DATEDIFF(SECOND, activity.run_requested_date, GETDATE()) AS Elapsed
FROM msdb.dbo.sysjobs_view job
JOIN msdb.dbo.sysjobactivity activity ON job.job_id = activity.job_id
JOIN msdb.dbo.syssessions sess ON sess.session_id = activity.session_id
JOIN (
    SELECT MAX(agent_start_date) AS max_agent_start_date
    FROM msdb.dbo.syssessions
    ) sess_max ON sess.agent_start_date = sess_max.max_agent_start_date
WHERE run_requested_date IS NOT NULL
    AND stop_execution_date IS NULL


Using this view, we can check for a running job before running our own job.
In this example we don’t want the nightly pricing build to run if a monthly build is in progress (note: nightly is scheduled later than monthly). So wrap the call to the stored procedure in the job step like this…
 
IF NOT EXISTS (SELECT * FROM myschema.Admin_RunningSQLJobs WHERE name = 'Pricing - Monthly Rebuild')
BEGIN
    EXEC myschema.[Nightly Pricing build]
END

 
 

SQL Server upgrade step 'msdb110_upgrade.sql' encountered error 2714, state 6, severity 25


Upgrading SQL Server to SP4 or to SQL Server 2016 encountered: upgrade step 'msdb110_upgrade.sql' encountered error 2714, state 6, severity 25

When doing a dummy run of a SQL instance upgrade, I encountered this error; it resulted in the SQL service not starting after the upgrade and the upgrade wizard reporting errors.

After a couple of attempts I had to dig into what was going on. Referencing the article "SQL Server Service fails to start after applying patch. Error: CREATE SCHEMA failed due to previous errors", I tried deleting the DatabaseMailUserRole schema from msdb, but the server still failed to start.

This was SQL Server 2012, so I checked the SQL Server logs found at C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Log\ERRORLOG.txt (the path may vary between versions).

The tail of the log looked like the following; I had already removed DatabaseMailUserRole, so it may have reported that too previously:

2019-02-15 17:04:20.53 spid4s      Setting object permissions...
2019-02-15 17:04:20.63 spid4s      Error: 2714, Severity: 16, State: 6.
2019-02-15 17:04:20.63 spid4s      There is already an object named 'TargetServersRole' in the database.
2019-02-15 17:04:20.63 spid4s      Error: 2759, Severity: 16, State: 0.
2019-02-15 17:04:20.63 spid4s      CREATE SCHEMA failed due to previous errors.
2019-02-15 17:04:20.63 spid4s      Error: 912, Severity: 21, State: 2.
2019-02-15 17:04:20.63 spid4s      Script level upgrade for database 'master' failed because upgrade step 'msdb110_upgrade.sql' encountered error 2714, state 6, severity 25. This is a serious error condition which might interfere with regular operation and the database will be taken offline. If the error happened during upgrade of the 'master' database, it will prevent the entire SQL Server instance from starting. Examine the previous errorlog entries for errors, take the appropriate corrective actions and re-start the database so that the script upgrade steps run to completion.
2019-02-15 17:04:20.63 spid4s      Error: 3417, Severity: 21, State: 3.
2019-02-15 17:04:20.63 spid4s      Cannot recover the master database. SQL Server is unable to run. Restore master from a full backup, repair it, or rebuild it. For more information about how to rebuild the master database, see SQL Server Books Online.
2019-02-15 17:04:20.63 spid4s      SQL Server shutdown has been initiated
2019-02-15 17:04:20.63 spid4s      SQL Trace was stopped due to server shutdown. Trace ID = '1'. This is an informational message only; no user action is required.
2019-02-15 17:04:20.80 spid14s     The SQL Server Network Interface library successfully deregistered the Service Principal Name (SPN) [ MSSQLSvc/***] for the SQL Server service.
2019-02-15 17:04:20.80 spid14s     The SQL Server Network Interface library successfully deregistered the Service Principal Name (SPN) [ MSSQLSvc/***:1433 ] for the SQL Server service.

So it seemed I had the same problem as the referenced blog post, but with another schema role.

Solution

After running the upgrade and it failing, I started SQL Server with trace flag 902, using net start mssqlserver /T902 from an elevated command prompt. This prevents the startup scripts from running.


I then connected to the SQL Server instance using SSMS, located 'TargetServersRole' under the msdb database, right-clicked and deleted it.


I then stopped SQL Server and restarted it normally, without the trace flag, so that it ran 'msdb110_upgrade.sql' at startup again.
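In summary, the whole recovery sequence from an elevated prompt was along these lines:

net start mssqlserver /T902   # start with trace flag 902 - startup scripts disabled
# ...in SSMS: delete the offending TargetServersRole schema under msdb...
net stop mssqlserver
net start mssqlserver         # normal start re-runs msdb110_upgrade.sql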

This time it started normally.


 

Problem sorted!

 

If this was helpful please comment as it motivates me to blog more!

SQL Server Optimize for Ad hoc Workloads setting in the server advanced properties window


TL;DR: turn this option on (from False to True).

Whilst looking at a problem SQL Server instance, I was on a diagnostic journey investigating why the query plan cache was being completely cleared every few minutes. It turned out the server had a bad memory setup and SQL Server was suffering severe memory pressure. However, I did learn about a setting for which I find it hard to think of a reason not to use in a typical setup (what install is typical, Tim?). The Optimize for Ad hoc Workloads option tells SQL Server not to cache the query plan for a query until it sees it twice.

Server properties window - highlighting Optimize for Ad hoc Workloads setting
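The same change can be scripted rather than clicked. A sketch using sp_configure via the SqlServer module’s Invoke-Sqlcmd (the server name is a placeholder):

# Enable 'optimize for ad hoc workloads'; it is an advanced option, so
# 'show advanced options' must be on first. Server name is a placeholder.
Invoke-Sqlcmd -ServerInstance "MYSQLSERVER" -Query @"
EXEC sys.sp_configure N'show advanced options', 1; RECONFIGURE;
EXEC sys.sp_configure N'optimize for ad hoc workloads', 1; RECONFIGURE;
"@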

 

This is important where the SQL query workload is very varied, particularly where ad-hoc, non-stored-procedure queries are being run, as they can bloat the query cache. The query cache stores the compiled execution plan required to execute a particular query. When a new query is encountered, the plan is calculated and put in the cache, to save having to compute the plan again if the query is seen again. The problem is that ad-hoc queries are unlikely to be seen again, so SQL Server is using up memory unnecessarily that could be used for better things.

I understand, but need to verify, that this is also true of ORMs such as Entity Framework: although it does create parameter-based SQL to execute, the declared length of those parameters can vary depending on the length of the values in them. This can create a large number of query plans for what are essentially very similar queries. (OK, technically the length of the searched text can vary the optimal query plan, but run with this.)

The setting will greatly reduce the number of plans, and the memory used for them, on the server, as a query is only fully cached if it is seen twice. The first time it is seen, a stub is created that is just enough to spot the query if it appears a second time; only then is the query plan cached. Truly ad-hoc queries won’t be seen again, and space in the cache is saved. The compile cost is not really worth worrying about: having to compile the query twice is no big deal when, after the second compile, the next 100,000 executions can come from the cache, so proportionally it is an efficiency worth having for the tiny hit of compiling the query plan twice.

There is a good article here on how to start looking into whether you have plan cache bloat: http://www.sqlservercentral.com/articles/SQLServerCentral/91250/

There is a similar setting, Forced Parameterization; for details see this post by Brent Ozar: https://www.brentozar.com/archive/2018/03/why-multiple-plans-for-one-query-are-bad/. Basically, this setting forces the server to examine queries more closely and infer where parameters would exist if you were to parameterize the query. This is great for reducing plans in the cache, but it increases processor work, as the server has to examine queries much more carefully to work out where the parameters lie. It can also lead to bad parameter sniffing: before, the plans would have been totally bespoke; now they will be shared, and the exact parameter values can change the optimal plan.
