Tuesday, September 26, 2017

Microsoft Dynamics GP Security and Audit Field Manual


My friends, MVP Mark Polino (@mpolino) and Andy Snook (@snookgofast), both members of the Fastpath team, have just released a comprehensive security book titled, Microsoft Dynamics GP Security and Audit Field Manual.

The book is available in print and Kindle formats on Amazon.com, and I encourage you to get a copy, read up, and put it into practice. It goes beyond the boring task of assigning security to windows and reports just to prevent someone from accessing some area of the application, and into the realms of compliance, separation of duties, and audit controls.



Finally, I want to take the opportunity to thank both Mark and Andy for extending me an invitation to write the foreword to their book. 

Until next post!

MG.-
Mariano Gomez, MVP

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services - Summary





My DevOps series has concluded, although I don't believe this will be the last time I write about the subject. DevOps is here to stay, and the tools and technologies to support development teams only keep getting better.

The following is a list of the topics I covered in the series, and I encourage you to add your comments to the comment section of the posts that caught your attention. Let me know what you are doing today and how you plan to incorporate DevOps into your development operations.

July 17 - #DevOps Series: Microsoft Dexterity Source Code Control with Visual Studio Team Services

July 17 - #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2

July 19 - #DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 2/2

Aug 01 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 1/3

Aug 16 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 2/3

Sep 25 - #DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 3/3


I also prepared this video, which I originally intended to add to the previous article, but I am really glad I left it for the summary post. Please be sure to check it out (better viewed in full screen mode).



The Helper.ps1 script, containing the library of functions used by the PowerShell scripts found in the previous article, can be downloaded from my OneDrive public share, here. These scripts are being updated constantly as we evolve our Build-Engine. Check back for additional updates.

Until next post!

MG.-
Mariano Gomez, MVP

Monday, September 25, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 3/3

In part 2 of this series, I covered how to set up a Build Definition for our Build-Engine project. I also began showing the steps the Build-Engine definition requires to take your development project from the dictionary to an actual set of extracted dictionaries and chunk files that can be delivered to your QA team.

NOTE: the same process can be used to take your dictionaries from QA to release to your download site.

The first step, as shown before, is to determine the source for the Build-Engine process. We said that we would use the Build-Engine project itself as source for the Build process, since it contains all our Dexterity (and Dexterity Utilities) files, PowerShell scripts, and macros to make it all happen.


1. Following the selection of the source, our first task is to create the necessary folders to host the various files. This task uses an inline PowerShell script to do so:

CreateFolders (inline PS script)
$folders = @("Build", "Source", "Logs", "Generic", "Temp")  # Create these folders

foreach ($item in $folders)
{
    mkdir "$(Get-Location)\$($item)\" -ErrorAction SilentlyContinue | Out-Null
}

The task creates the following folders:

Build: stores chunk files with no source code

Source: stores the extracted dictionaries and chunk files

Logs: stores all the log files generated by Dexterity Utilities in the process of extracting and chunking dictionaries

Generic: stores the downloaded Dexterity project repository files

Temp: stores any additional component needed throughout the Build process

2. Once we have set up the needed folders, we can then proceed to retrieve our Dexterity project from the VSTS repository. For this purpose, we set up a task that runs our Get_VSTS.ps1 PowerShell script.

Get_VSTS.ps1
Param(
    [string]$SingleModule = "",
    [int]$BuildNumber,
    [string]$VSTSUser,
    [string]$VSTSUserPAToken,
    [switch]$TestStructure
)

. "$(Get-Location)\Scripts\Helper.ps1"

$modules = Get-ModuleData -Module $SingleModule
if ($modules.Status -ne 0) {
    Write-Host "Invalid Module : $($SingleModule)" -ForegroundColor Red
    exit
}

$sourceModule = $modules.SourceFolder
Write-Host "Pulling Module : $($modules.Selected)" -ForegroundColor Green

# ==============================
# Retrieve source files to pull.
# ==============================
$baseWebFolder = "$/MICR/Base/2/2015B$($BuildNumber)/"
$SourceCodeFolder = "$($baseWebFolder)/$($sourceModule)" # Where to pull from.
$genericFolder = "$(Get-Location)\Generic\" # Where to push files to.

$scopePath_Escaped = [uri]::EscapeDataString($SourceCodeFolder) # Need to have this in 'escaped' form.

Write-Host "`tfrom $($SourceCodeFolder)`n`tinto $($genericFolder)`n"

$recursion = 'Full' # OneLevel or Full
#$recursion = 'OneLevel' # or Full

# Base64-encodes the Personal Access Token (PAT) appropriately
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $VSTSUser, $VSTSUserPAToken)))

# Construct the REST URL to obtain the MetaData for the folders / files.
$uri = "https://somedomain.visualstudio.com/DefaultCollection/_apis/tfvc/items?scopePath=$($scopePath_Escaped)&recursionLevel=$($recursion)&api-version=2.2"

# Invoke the REST call and capture the results
$result = $null
$result = Invoke-RestMethod -Uri $uri -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic $($base64AuthInfo)")}

# This call returns the METADATA for the folder and files. No File contents are included.
if ($result.count -eq 0)
{
     throw "Unable to locate code at $($SourceCodeFolder)"
}
$result

# ==============================================
# Create folder structure and sort file objects.
# ==============================================
$script:startTime = Get-Date
$sortedFiles = New-Object 'System.Collections.Generic.SortedDictionary[string, string]'

$_removeLength = $baseWebFolder.Length
for ($index = 0; $index -lt $result.count; $index++)
{
    # $_path = $result.value[$index].path.substring($_removeLength)
    $_path = "$($genericFolder)$($result.value[$index].path.substring($_removeLength))" -replace "/", "\"
    if ($result.value[$index].isFolder -eq $true)
    {
        Write-Host "`t$($_path)" # -BackgroundColor Blue -ForegroundColor Yellow
        New-Item -Force -ItemType directory -Path $_path | Out-Null
    }
    else
    {
        $sortedFiles[$_path] = $result.value[$index].url
    }
}

# =======================================================
# Create a runspace pool where $maxConcurrentJobs is the
# maximum number of runspaces allowed to run concurrently
# =======================================================
$script:maxConcurrentJobs = 10
$script:asyncObj = $null
$Runspace = [runspacefactory]::CreateRunspacePool(1,$script:maxConcurrentJobs)

# Open the runspace pool (very important)
$Runspace.Open()

#$script:Authorization = @{Authorization=("Basic {0}" -f $base64AuthInfo)}
$script:Authorization = @{Authorization=("Basic $($base64AuthInfo)")}
$SortedFiles.GetEnumerator() | foreach {
    # Create a new PowerShell instance and tell it to execute in our runspace pool
    $ps = [powershell]::Create()
    $ps.RunspacePool = $Runspace

    # Base command to 'BeginInvoke'
    # Invoke-RestMethod -Uri $using:remote -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $using:base64AuthInfo)} -OutFile $using:local

    [void]$ps.AddCommand("Invoke-RestMethod")
    [void]$ps.AddParameter("OutFile", $_.Key)
    [void]$ps.AddParameter("Uri", $_.value)
    [void]$ps.AddParameter("Method", "Get")
    [void]$ps.AddParameter("ContentType", "application/json")
    [void]$ps.AddParameter("Headers", $script:Authorization)

    # Begin execution asynchronously (returns immediately)
    $script:asyncObj = $ps.BeginInvoke()
}

# ==========================================
## Run the parallel processes to completion.
# ==========================================
if ($script:asyncObj -ne $null) {
    Write-Host "Pulling $($SortedFiles.Count) code files..."
    while ($script:asyncObj.IsCompleted -eq $false) {}
    Write-Host "`tTime elapsed to pull code: $((Get-Date) - $($script:startTime))"
}

# ================================================
## Change MPP to MMM. Simplifies later processing.
# ================================================
Push-Location $($genericFolder) #Generic
if (Test-Path "$($genericFolder)\MPP" -PathType Any)
{
    Remove-Item -Path "MMM" -Recurse -ErrorAction SilentlyContinue | Out-Null ## Remove any previous MMM code.
    Rename-Item -Path "MPP" -NewName "MMM"
}
Pop-Location ## Back to where the code was.

This script accepts 5 parameters: the module code (we currently support 5 products), which is validated to prevent an empty value from being passed in (passing "All" builds all products); the product build number; the repository user and personal access token; and a switch to test the folder structure once it's created. These parameters are passed in by the actual Build definition step.
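For reference, the arguments line on that Build definition step might look something like this (the build variable names shown are illustrative, not the actual variables defined in our Build definition):

Get_VSTS.ps1 task arguments (illustrative)
-SingleModule "All" -BuildNumber $(ProductBuildNumber) -VSTSUser $(VSTSUser) -VSTSUserPAToken $(VSTSPAToken)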

To retrieve the source files, we construct the service URI and also determine where the files are going to be deposited once retrieved. This is done by setting up a relative path to the Generic folder we created in step 1.

Once we connect to the service, we begin retrieving the files by using a for() control structure. There are some other steps that are only relevant to the environment for which this Build process has been designed.
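The module validation at the top of the script relies on the Get-ModuleData function in Helper.ps1, which I will share in the closing post. Based purely on how Get_VSTS.ps1 consumes its result, a minimal sketch of that function could look like the following (the module codes and folder names are made up):

Get-ModuleData (illustrative sketch)
function Get-ModuleData {
    Param([string]$Module)

    # Hypothetical map of module codes to their source folders in the repository.
    $moduleMap = @{ "MMM" = "MMM"; "MPP" = "MPP" }

    # Status 0 = valid selection; anything else is rejected by the caller.
    if ([string]::IsNullOrEmpty($Module) -or ($Module -ne "All" -and -not $moduleMap.ContainsKey($Module))) {
        return [PSCustomObject]@{ Status = 1; Selected = $Module; SourceFolder = "" }
    }

    $sourceFolder = if ($Module -eq "All") { "" } else { $moduleMap[$Module] }
    return [PSCustomObject]@{ Status = 0; Selected = $Module; SourceFolder = $sourceFolder }
}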

3. Upon retrieving the files from the Dexterity project repository, we are now ready to set up module environment variables and compile the dictionaries, in preparation for the extraction and chunking process.

SetupModuleEnvironment.ps1
Param(
    [int] $VersionNumber,
    [int] $BuildNumber = "000",
    [int] $SubBuildNumber = "0",
    [string] $SingleModule = $null
)

. "$(Get-Location)\Scripts\Helper.ps1"

# Create the folder structures
$folders = Create-FoldersCommands -Version $VersionNumber -Module $SingleModule
Write-Host "Creating Folders:"
foreach ($item in $folders) {
    Write-Host "`t$($item.Folder)"
    mkdir $item.Folder -ErrorAction SilentlyContinue | Out-Null
}


# Copy the files, with replacement of text in text files.
# Ensure the files are saved as 'ASCII'.
$files = Copy-FilesCommands -Version $VersionNumber -Module $SingleModule -BuildNumber $BuildNumber -SubBuildNumber $SubBuildNumber
Write-Host "Copying Files:"
foreach ($item in $files) {
    Write-Host "`tFrom`t$($item.From)"
    Write-Host "`tTo`t`t$($item.To)"
    copy $($item.From) $($item.To)
    Set-ItemProperty $item.To IsReadOnly -Value $false

    if ($item.Replacements -ne $null) {
        $item.Replacements.psobject.properties |
        foreach {
            $_name = "%$($_.name)%" # The name of the parameter is the text to be replaced, surrounded by '%'
            $_value = "$($_.value)"

            (Get-Content $item.To) -replace $_name, $_value | Set-Content $item.To -Encoding Ascii
        }
    }
}

Of particular importance is the fact that we use a PowerShell helper script (Helper.ps1), which contains a number of functions that capitalize on the parameters passed here. The general idea, nonetheless, is to make a number of replacements within the macros that assign product information and build numbers, taking into account the version of Microsoft Dynamics GP for which we will be creating the chunks. It also creates the shortcuts for Dexterity and Dexterity Utilities to compile and extract the dictionaries, using the proper dictionaries and macros that will run when the Dex platform executables are launched.
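To illustrate the replacement mechanism, this is roughly the shape of an entry Copy-FilesCommands could return; the file names, tokens, and values below are made up for the example:

Copy-FilesCommands entry (illustrative)
$item = [PSCustomObject]@{
    From         = ".\MacroTemplates\SetVersion.template.mac"   # hypothetical macro template
    To           = ".\Temp\SetVersion.mac"
    Replacements = [PSCustomObject]@{ BuildNumber = "123"; VersionNumber = "16" }
}
# Any occurrence of %BuildNumber% or %VersionNumber% in the copied file
# is rewritten to 123 and 16, respectively, by the loop shown above.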

In the closing post, summarizing all the articles within this series, I will attach a copy of the Helper.ps1 script.

4. Upon making these replacements and compiling the dictionaries, we can then proceed to extract and chunk our dictionaries.

ChunkDictionaries.ps1
Param(
    [int] $VersionNumber,
    [string] $SingleModule = $null
)
. "$(Get-Location)\Scripts\Helper.ps1"

$EXEx = Create-ExecutableCommands -Version $VersionNumber -Module $SingleModule
Write-Host "Building..."
foreach ($item in $EXEx) {
    Write-Host "$($item.Version)`t$($item.Module)`t$($item.Message)`t" -NoNewline
    Write-Host "`n`t$($item.Executable) : $($item.Timeout) seconds Max.`n`t$($item.Dictionary)`n`t$($item.Macro)`t"

    <##>
    # keep track of timeout event
    $timeouted = $null # reset any previously set timeout
    $proc = Start-Process -FilePath $item.Executable -ArgumentList @($item.Dictionary, $item.Macro) -PassThru
    # wait up to x seconds for normal termination
    $proc | Wait-Process -Timeout $item.Timeout -ea 0 -ev timeouted
    <##>
    $msg = "Finished."

    if ($timeouted)
    {
        # terminate the process
        $msg = "Time Out!!"
        $proc | kill
    }
    elseif ($proc.ExitCode -ne 0)
    {
        # update internal error counter
        $msg = "Error: $($proc.ExitCode)."
    }

    Write-Host "`t$($msg)"
}

Once again, this script takes advantage of the PowerShell helper script library to extract the source code from the development dictionaries and auto-chunk the extracted dictionaries. Note that this script takes in the version of GP to determine the proper version of Dexterity and Dexterity Utilities to launch. This process is completed twice: once for chunks with source (Remove Unused Block in Dexterity Utilities Auto-Chunk option) and another for chunks without source (Total Compression). The source chunks are moved to the Source folder on the Build agent and the object chunks are moved to the Build folder on the Build agent.
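For clarity, this is the shape of one item returned by Create-ExecutableCommands, inferred from the properties ChunkDictionaries.ps1 consumes; all paths and values shown are made up:

Create-ExecutableCommands item (illustrative)
[PSCustomObject]@{
    Version    = 16
    Module     = "MMM"
    Message    = "Extract and auto-chunk"
    Executable = "C:\Dex16\DexUtils.exe"          # hypothetical Dexterity Utilities path
    Dictionary = ".\Temp\MMM_Dev.dic"             # hypothetical development dictionary
    Macro      = ".\Temp\ExtractAndChunk.mac"     # hypothetical macro recorded in Dex Utilities
    Timeout    = 600                              # maximum seconds to wait before killing the process
}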

NOTE: the Source and Build folders are created by the CreateFolders inline PowerShell script in task 1 above.

5. Upon finalizing the extraction and chunking process of the dictionaries, we move the chunk files with no source code (Total Compression chunks) to the Build sub-folder in the artifacts directory. The artifacts folder is where all resulting files will be stored after the process itself is complete.

Copy Build Artifacts step
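Whether this step uses the built-in Copy Files task or an inline script, the effect is roughly the following sketch; the staging path comes from the agent's BUILD_ARTIFACTSTAGINGDIRECTORY environment variable:

Copy Build Artifacts (inline PowerShell sketch)
$staging = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY "Build"
New-Item -ItemType Directory -Path $staging -Force | Out-Null
# Copy the Total Compression chunks produced by the previous tasks.
Copy-Item -Path "$(Get-Location)\Build\*" -Destination $staging -Recurse -Force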

6. Then we move the chunks and extracted dictionaries with source code to the Source sub-folder in the artifacts directory.

Copy Source Artifacts
The following Microsoft Docs article talks about Artifacts in Release Management in more detail.

7. Finally, since the Build Agent is volatile, you will need to move the artifacts off the agent and onto a permanent storage location, whether that's on the VSTS servers or a local folder. This is accomplished by publishing the artifacts.

Publish Build Artifacts
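Assuming the built-in Publish Build Artifacts task, a typical configuration would look something like this (the artifact name is just an example):

Publish Build Artifacts (example settings)
Path to Publish : $(Build.ArtifactStagingDirectory)
Artifact Name   : drop
Artifact Type   : Server (stored on VSTS) or File share (a local/UNC folder)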

My final article in this series will summarize the series and provide links to all previous articles, along with providing a link to the Helper.ps1 PowerShell library.

Until next post!

MG.-
Mariano Gomez, MVP

Wednesday, August 16, 2017

#DevOps Series: Building Dexterity Applications with Visual Studio Team Services Part 2/3

I am so amped-up after my return from the Microsoft Dynamics GP Technical Conference 2017 in Fargo, ND, where I had a chance to catch up with my friends in the partner and ISV community (more on that in a later post). This year, I had a chance to introduce the topics I have been discussing here in my DevOps series and now that I am back, I want to continue writing about the subject as it gets more and more exciting.

In Part 1 of this specific chapter within the series, I talked about building the actual Build-Engine project. If you remember, I specifically said that the build templates provided by Visual Studio Team Services (VSTS) do not fit the bill for Dexterity projects. Dex projects tend to be a bit more cumbersome since we need to have the entire IDE around to compile, extract, and chunk our products. So, it's best if we can isolate these components into an altogether separate project (from that of our actual Dex product) for clarity's sake and to maintain our own sanity.


Creating a Build Definition for your Build-Engine Project

Now that we have the Build-Engine project in place, we can proceed to setup a Build Definition. The Build Definition is going to encompass all of the steps required to do things like:

1. Download the resources from our Build-Engine project (Dex IDEs, clean dictionaries, PowerShell scripts, macro templates, etc.)

2. Setup any folders needed to support the build process and temporarily store files, etc.

3. Pull the source code from our Dexterity project repository

4. Setup all environment variables

5. Extract dictionaries and create chunk files with source code (remove unused blocks) and without source code (total compression).

6. Copy the chunks without source code into an artifact folder

7. Copy chunks and source dictionaries for debugging into an artifact folder

8. Publish the artifact folder

To create a new build for our Build-Engine project:

1. Click on Build & Release, then click the New button.


2. Select an empty template. Dexterity projects clearly do not conform to any of the existing, pre-defined molds.


3. Click the Apply button to continue.

4. You can now enter the name of your Build-Engine and select from a list of 4 agent queues: Default, Hosted, Hosted Linux Preview, or Hosted VS2017. For all intents and purposes, hosted build agent pools run in the cloud, but build agents can run locally as well. For more information on Hosted Agents, click here. These options define the Build process itself.


The best-suited option for our Dexterity Build-Engine is Hosted.

5. On the left pane, we can now click on the first task, Get Sources, to identify where the resources for our Build-Engine will come from. In this case, they will come from our Build-Engine project itself, which contains the Dexterity IDEs for versions 12 (GP 2013), 14 (GP 2015), and 16 (GP 2016). All other options can be left at their defaults.



This completes the first step (Download Resources for our Build Engine) for today. You can click on Save & Queue to test that all files download properly for the build agent pool.


Tomorrow, I will show you how to leverage Visual Studio Team Services' Build process to extract and chunk your Dexterity applications.

Until next post!

MG.-
Mariano Gomez, MVP

Monday, July 17, 2017

#DevOps Series: Upgrading Microsoft Dexterity VSS and TFS repositories to Visual Studio Team Services - Part 1/2

Yesterday, we talked about #DevOps Series: Microsoft Dexterity source code control with Visual Studio Team Services. The article mainly focused on setting up your Team Services project repository for the first time, taking an existing development dictionary, prepping it, and checking the resources into the repository. But what if you already have a Visual SourceSafe (VSS) or Team Foundation Server (TFS) repository in place and you are just looking to move to VSTS?



Migrating Microsoft Dexterity repositories from Visual SourceSafe to Visual Studio Team Services

There are two acceptable methods to migrate your Microsoft Dexterity projects from a VSS repository to VSTS: you can use the VSS Upgrade Wizard or the VSSUpgrade command-prompt tool. Now, I am a big fan of command-prompt tools, but this is one case where I would suggest you ditch it for the wizard.

If you would like more information on the VSSUpgrade command-prompt tool, please click here.

Using the VSS Upgrade Wizard 

This is, by far, the method I recommend. The wizard provides step-by-step instructions, which makes the process of moving to VSTS a no-brainer. There are a few things you will need to do beforehand.

Preparing for the Upgrade

1.- First, if you are on a version prior to Visual SourceSafe 6.0, you will need to upgrade Visual SourceSafe to version 6 before you can attempt the migration. You can download Visual SourceSafe 6.0 here, but please note that this IS NOT an official Microsoft download site, hence, exercise due care when opening any files from an unknown location. Also note that Microsoft support for VSS ended in 2012 - that's right! You are on your own here.

2.- Next, you will need to have a SQL Server available to use as temporary storage for the upgrade process. Since you are already running Microsoft Dynamics GP on some SQL Server, you could probably create a separate instance where you can perform the upgrade. I wouldn't recommend using your production instance to do so.

NOTE: Although SQL Server Express Edition is probably fine for the upgrade, I do recommend you use at the very least SQL Server Standard Edition to prevent any migration issues due to database size limitations imposed by SQL Server Express Edition. If your repositories tend to be very large from years and years of coding (in our case 20 years!) you are probably better off with the Standard Edition of SQL Server.

3.- You will then need to check all your Microsoft Dexterity project resources into your VSS repository and remove access to all repositories for all developers but the (main) administrator.

4.- You will need to have already provisioned a Team Services account; refer to the previous article in this series for a primer on this process. We found this out the hard way: make sure you create all project shells for your VSS projects before you conduct the upgrade, as the upgrade tool will need this done in advance.

5.- Make a copy of your VSS database and work from the copy. Restore the copy onto the instance of SQL Server you created in step 2. Makes sense? Ok, let's move on. As usual, you will not want to expose yourself to some sort of data corruption, so please do not work with your original VSS databases in case something goes wrong. See How To Back Up a Visual SourceSafe Database for additional information on this process.

6.- Download and install the Visual SourceSafe Upgrade Tool for Team Foundation Server (and Visual Studio Team Services). You can get the tool here. You must install the tool on the same machine where you made the copy of your repository database.

7.- Run the VSS Analyze utility to ensure there are no inconsistencies in your VSS database that would prevent the upgrade from succeeding. If Analyze reports any errors, you will need to repair the database before beginning the upgrade.
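As an example, a typical Analyze run against the copy of the VSS data folder looks roughly like this; the install and data paths below are placeholders, and the available switches can vary by VSS version:

Running Analyze (illustrative)
# Report-only pass; add -f on a second pass to attempt repairs if errors are reported.
& "C:\Program Files\Microsoft Visual SourceSafe\analyze.exe" -v4 "D:\VSSCopy\data"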

8.- For additional preparation steps, please refer to the following MSDN article, Prepare to upgrade from Visual SourceSafe.

Using the Wizard

1. Launch the tool downloaded in Step 6 above. Go to Start and run the VSS Upgrade Wizard.

2. On the Visual SourceSafe Repository page, specify the repository, and if necessary, the Admin password.

Visual SourceSafe Repository page
3. To display the projects in your VSS repository, choose the List Available Projects link. Select the projects you want to upgrade.

List Available Projects
4. Select the check box at the bottom of the page to confirm you have run Analyze. See Step 7 above. Choose Next to proceed.

5. On the Team Project Page, choose Browse and then use the Select a Team project for Migration dialog box to specify the team project into which you want to port the upgraded data. My absolute recommendation here is to select a new team project that you have not been using.

Select a Team Project for Migration page

Choose Next.

6. On the Options page, select whether you want to upgrade the full history or just the tip (to omit historical data). When we did this migration, we truncated the data we didn't want to upgrade. That would be done as an optional step after step 5 above, once the repository database has been copied.

Options page
7. On the Options page, specify the name of the SQL Server instance you want the wizard to use for temporary storage.

Options page
Choose Next to continue.

8. Review all settings and choose Next. The wizard will run a final check to ensure the upgrade can proceed. Choose Upgrade to continue.

9. Once the upgrade is finished, you should be able to navigate to your Visual Studio Team Services account page and verify that all projects have been migrated successfully. If you come across any issues, make sure you print the Migration Report and follow the information provided here to complete any additional steps needed to fix them.



Tomorrow, I will walk through the steps to upgrade from TFS to VSTS. Have you completed a migration from VSS to VSTS? I would like to hear your take on it and what "lessons learned" came from executing the migration.

Until next post!

MG.-
Mariano Gomez, MVP