Create-react-app is the de facto choice for most of the websites that I work on these days. In this post, we will see how to set up a build/deploy pipeline for create-react-app in Azure DevOps. We will be using the YML format for the pipeline, which makes it possible to have the build definition as part of the source code.

    Build Pipeline

    In the DevOps portal, start by creating a new Build pipeline and choose the ‘Node.js with React’ template. By default, it comes with the ‘Install Node.js’ step that installs the required node version and an ‘npm script’ step to execute any custom scripts. The output of the build must be published as an artifact that the Release step can deploy. To support this, we add the Archive Files and Publish Build Artifacts tasks, giving the following steps in the YML file.

    • Install Node.js
    • Build UI (Npm script)
    • Create Archive
    • Publish Artifacts
    # Node.js with React
    # Build a Node.js project that uses React.
    # Add steps that analyze code, save build artifacts, deploy, and more:
    # https://docs.microsoft.com/azure/devops/pipelines/languages/javascript
    
    trigger:
      - master
    
    variables:
      uiSource: "src/ui"
      uiBuild: "$(uiSource)/build"
    
    pool:
      vmImage: "ubuntu-latest"
    
    steps:
      - task: NodeTool@0
        inputs:
          versionSpec: "10.x"
        displayName: "Install Node.js"
    
      - script: |
          pushd $(uiSource)
          npm install
          npm run build
          popd
        displayName: "Build UI"
    
      - task: ArchiveFiles@2
        displayName: Archive
        inputs:
          rootFolderOrFile: "$(uiBuild)"
          includeRootFolder: false
          archiveType: "zip"
          archiveFile: "$(Build.ArtifactStagingDirectory)/ui-$(Build.BuildId).zip"
          replaceExistingArchive: true
    
      - task: PublishBuildArtifacts@1
        displayName: Publish Artifacts
        inputs:
          PathtoPublish: "$(Build.ArtifactStagingDirectory)"
          ArtifactName: "drop"
          publishLocation: "Container"

    The above pipeline generates a zip artifact of the contents of the ‘build’ folder.

    Release Pipeline

    To release to Azure Web App, create a new release pipeline and add the Azure Web App Task. Link with the appropriate Azure subscription and select the web application to deploy.
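
    If you prefer to keep the deployment in YAML as well, the equivalent step looks roughly like the sketch below; the service connection name, app name, and package path are placeholders for your own values.

    - task: AzureWebApp@1
      displayName: Deploy UI
      inputs:
        # Placeholder: name of your Azure Resource Manager service connection
        azureSubscription: "<your-service-connection>"
        # Placeholder: name of the Azure Web App to deploy to
        appName: "<your-web-app-name>"
        # Placeholder: path to the zip produced by the build pipeline
        package: "$(System.DefaultWorkingDirectory)/**/ui-*.zip"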

    Frontend Routing

    When using React, you will likely use a routing library like react-router. In this case, the routing library must handle the URLs, not the server hosting the files. The server will fail to serve those routes, as it has nothing to interpret them. When hosting on IIS (which is also the case for an Azure Web App on Windows), add a web.config file to the public folder. This file automatically gets packaged at the root of the artifact. The file has a URL Rewrite config that takes any route and points it to the root of the website so that the Index.html file is served. E.g., if the website has a route ‘https://example.com/customer/1223’ and a user hits this URL directly in the browser, IIS will rewrite it to ‘https://example.com’ and serve back the default file (Index.html). React Router will then handle the rest of the route and serve the appropriate React component for ‘customer/1223’.

    If APIs are part of the same host, they need to be excluded from the URL Rewrite rule. The config below ignores ‘/api’ from being rewritten, along with any URL that matches a physical file on the server (CSS, JS, images, etc.).

    <?xml version="1.0"?>
    <configuration>
        <system.webServer>
            <rewrite>
                <rules>
                    <rule name="React Routes" stopProcessing="true">
                        <match url=".*" />
                        <conditions logicalGrouping="MatchAll">
                            <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                            <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                            <add input="{REQUEST_URI}" pattern="^/(api)" negate="true" />
                        </conditions>
                        <action type="Rewrite" url="/" />
                    </rule>
                </rules>
            </rewrite>
            <staticContent>
                 <mimeMap fileExtension=".otf" mimeType="font/otf" />
            </staticContent>
        </system.webServer>
    </configuration>

    Environment/Stage Variables

    When deploying to multiple environments (Test, Staging, Production), I like to have the configs as part of Azure DevOps Variable Groups. It allows having all the configuration for the application in one place and makes it easier to manage. These variables are replaced in the build artifact at the time of release, based on the environment it is being released to. One way to handle this is to have a script tag in the ‘Index.html’ file as below.

    <head>
      <script>
        window.BookingConfig = {
          searchUrl: "https://example.com/search-service/",
          bookingUrl: "https://example.com/booking-service/",
          isDevelopment: true,
          imageServer: ""
        };
      </script>
      <meta charset="utf-8" />
      <link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
      ...
    </head>

    This file has the configuration for local development, allowing any developer on the team to pull down the source code and start running the application. Also add an ‘Index.release.html’ file, which is the same as Index.html but with placeholders for the variables. In the example, isDevelopment is an optional config and is false by default, hence not specified in the Index.release.html file.

    <head>
      <script>
        window.BookingConfig = {
          searchUrl: "#{SearchUrl}#",
          bookingUrl: "#{BookingUrl}#",
          imageServer: "#{ImageServer}#"
        };
      </script>
      <meta charset="utf-8" />
      <link rel="icon" href="%PUBLIC_URL%/favicon.ico" />
      ...
    </head>

    In the build step, add a command-line task to replace Index.html with Index.release.html.

    This step must run before the npm step that builds the application so that the correct Index.html file is packaged as part of the artifact.

    - task: CmdLine@2
      inputs:
        script: |
          # The build agent is ubuntu-latest, so use rm/mv (not the Windows del/ren)
          echo Replace Index.html with Index.release.html
          rm Index.html
          mv Index.release.html Index.html
        workingDirectory: "$(uiSource)/public"

    In the release step, add the Replace Tokens task to replace the tokens in the new Index.html file (Index.release.html in source control). Specify the appropriate root directory and the target files to have the variables replaced. By default, the token prefix and suffix are ‘#{’ and ‘}#’. Add a new variable group for each environment/stage (Test, Staging, and Prod), add the variables to the group, and associate it with the appropriate stage in the release pipeline. The task will replace the configs from the Variable Groups at the time of release.
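
    If you define the release in YAML, the same task can be expressed roughly as in the sketch below. It assumes the ‘Replace Tokens’ marketplace extension is installed, and the root directory is a placeholder for wherever the extracted Index.html lives in your release.

    - task: replacetokens@3
      displayName: Replace config tokens
      inputs:
        # Placeholder: point this at the folder containing the extracted Index.html
        rootDirectory: "$(System.DefaultWorkingDirectory)/drop"
        targetFiles: "**/Index.html"
        tokenPrefix: "#{"
        tokenSuffix: "}#"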

    I hope this helps you to set up a Build/Release pipeline for your create-react-app!

    2019: What Went Well, What Didn't and Goals

    A short recap of the year that has gone by, and looking forward!

    Another year has gone by so fast, and it is again time to do a year review.

    TL;DR

    2019 has been a fantastic year with lots of new learning, blogging, reading, running, and cycling. I started creating content for YouTube. Travel and Swimming did not go as planned. Looking forward to 2020!

    What went well

    Running and Cycling

    I did lots of running and cycling again this year. I wanted to do a couple of events (including a full marathon); however, that did not happen. The only event I did was the Brisbane to Gold Coast 100k cycling event, which was my first 100k ride and a great experience. I got a Tacx Neo towards the end of the year and am looking to start using it for structured training in the coming years. For running, following the FIRST running plan has helped me a lot to improve my average pace.

    [Image: Strava summary]

    Blogging and YouTube

    I was a lot more consistent with the number of posts this year. Except for October (when I was on vacation), I had a minimum of 2 posts every month. I am also trying to complement the blog posts with YouTube videos and be more regular at it. I have published 10+ videos since August and am trying to build up my channel and content. Subscribe here if you are interested in knowing every time a new video is published.

    Learning and Reading

    I stumbled across Exercism during the year and found the FSharp track interesting. I completed the core exercises on the track. CSS is something I have always struggled with. Towards the end of this year, I took the Advanced CSS and Sass course on Udemy. I am halfway through the course and finding it extremely useful. It has helped me heaps to get going with CSS and Sass. I did want to build the Key Vault Explorer; however, that never took off.

    As for reading, I had set a goal of 10 books for this year and am happy to have finished 11. I highly recommend the books Digital Minimalism and The Bullet Journal Method. Here is what I have been experimenting with after reading Digital Minimalism and how I have been using the Bullet Journal methodology.

    What didn’t go well

    I am happy with this year, having set out with the right goals and being able to meet most of them. Here are some things that could have been better.

    • Swimming It’s been almost two years since I have been on and off with swimming. I have come a long way; however, I am still not at the point where I am comfortable saying I swim well.

    • Travel Two trips back to India took most of my vacation time. We also visited Bundaberg and Rockhampton - places elsewhere in Queensland. However, we did not make any other international trips.

    Goals for 2020

    • Reading Read 12 books - bumping it up by two from last year’s challenge. I want to add more variety to the books.

    • Blogging and Youtube 3 posts and 3 videos every month. Try and build up a niche/specialization. It is something I have wanted to do for a long time, but it has never happened.

    • Tri Sports Complete a marathon. Focus more on swimming. Complete a training program on my Tacx Neo with TrainerRoad.

    • Learning Learn about Containers and SAFE stack.

    I started with Bullet Journaling in 2019, and it has been helping me a lot with planning and organizing myself. I plan to use the same in 2020 and have got a new Journal, all ready to go.

    Wishing you all a Happy and Prosperous New Year!

    While playing around with the Windows Terminal, I had set up Aliasing to enable aliases for commonly used commands.

    E.g., typing in s implies git status.

    I wanted to create new command aliases from the command line itself, instead of opening up the script file and modifying it manually. So I created a PowerShell function for it.

    $aliasFilePath = "<Alias file path>"
    
    function New-CommandAlias {
        param(
            [parameter(Mandatory=$true)]$CommandName,
            [parameter(Mandatory=$true)]$Command,
            [parameter(Mandatory=$true)]$CommandAlias
        )

        # Write out the function text; escape $args with a backtick so it ends up
        # literally in the alias file instead of being expanded here
        $functionFormat = "function $commandName { & $command `$args }
    New-Alias -Name $commandAlias -Value $commandName -Force -Option AllScope"

        $newLine = [Environment]::NewLine
        Add-Content -Path $aliasFilePath -Value "$newLine$functionFormat"
    }
    
    . $aliasFilePath

    The script does override existing aliases with the same name. Use the ‘Get-Alias’ cmdlet to find existing aliases.

    The above script writes out a new function and maps the alias to it using the existing New-Alias cmdlet:

    function Get-GitStatus { & git status -sb $args }
    New-Alias -Name s -Value Get-GitStatus -Force -Option AllScope

    Add this to your PowerShell profile file (run notepad $PROFILE), as we did for theming when we set up the Windows Terminal. In the above script, I write to the ‘$aliasFilePath’ and load all the aliases from that file using the dot sourcing operator.

    Below are a few sample usages

    New-CommandAlias -CommandName "Get-GitStatus" -Command "git status -sb" -CommandAlias "s"
    New-CommandAlias -CommandName "Move-ToWorkFolder" -Command "cd C:\Work\" -CommandAlias "mwf"

    The full gist is available here. I have only tried adding a couple of commands, and it worked fine. If you find any issues, please drop a comment.

    For a long time, I have been using Cmder as my command line. It was mostly for the ability to copy-paste, open multiple tabs, and add aliases (shortcut commands). I was never particularly interested in other customizations of the command line. However, a recent tweet made me explore the new Windows Terminal.

    Windows Terminal is a new, modern, feature-rich, productive terminal application for command-line users. It includes many of the features most frequently requested by the Windows command-line community, including support for tabs, rich text, globalization, configurability, theming & styling, and more.

    You can install it using the command line itself or get it from the Windows Store. I prefer the Windows Store version as it gets updated automatically.
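
    For example, with Chocolatey the install is a one-liner; the package name below is assumed, so verify it before running.

    choco install microsoft-windows-terminal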

    Toggling

    Pressing the WIN key (Windows key) + # (the position of the app on the taskbar) works as a toggle. If the app is open and selected, it will minimize; if not, it will bring it to the front and select it. If the app is not running, it will start the app.

    In my case, Windows Key + 1 launches Terminal, Windows Key + 2 launches Chrome, Windows Key + 3 launches Visual Studio and so on.

    Theming

    To theme the terminal, you need to install two PowerShell modules.

    Install-Module posh-git -Scope CurrentUser
    Install-Module oh-my-posh -Scope CurrentUser

    To load these modules by default when launching PowerShell, update the PowerShell profile. For this, run ‘notepad $PROFILE’ from a PowerShell command line. Add the below lines to the end of the file and save. You can choose an existing theme or even make a custom one, and customize it further as you want. Here is a great example to get started. I currently use the Paradox theme.

    Import-Module posh-git
    Import-Module oh-my-posh
    Set-Theme Paradox

    Restart the prompt, and if you see squares or weird-looking characters, you likely need some updated fonts. Head over to Nerd Fonts, where you can browse for them.

    Nerd Fonts patches developer-targeted fonts with a high number of glyphs (icons) and gives all those cool icons in the prompt.

    To make Windows Terminal use the new font, update the settings. Click the button with a down arrow right next to the tabs or use the Ctrl + , shortcut. It opens the profiles.json settings file, where you can update the font face per profile.

    "fontFace": "UbuntuMono NF",

    Aliasing

    I use the command line mostly for interacting with git repositories and like having shorter versions of commonly used commands, like git status, git commit, etc. My previous command line, Cmder, had a feature to set aliases. Similarly, in PowerShell, we can create a function to wrap the git command and then use the New-Alias cmdlet to create an alias. You can find a good list to start with here and modify it as you need. I have the list of aliases in a separate file and load it in the profile as below. Having it in Dropbox allows me to sync it across multiple devices and have the same aliases everywhere.

    Use the dot sourcing operator to run the script in the current scope, which adds everything in the specified file to the current scope.

    . C:\Users\rahul\Dropbox\poweshell_alias.ps1

    The aliases do override any existing aliases with the same name, so make sure that you use aliases that don’t conflict with anything you already use. Here is the powershell_alias file that I use.

    I no longer use Cmder and enjoy using the new Terminal. I have just scratched the surface of the terminal here; there is heaps more that you can do: formatting, customizing, adding other shells, etc.

    Enjoy the new Terminal!

    At work, we usually use DbUp to deploy changes to SQL Server. We follow certain naming conventions when creating table constraints and indexes. Here is an example:

    create table Product
    (
      Id uniqueidentifier not null unique,
      CategoryId uniqueidentifier not null,
      VendorId uniqueidentifier not null,
    
      constraint PK_Product primary key clustered (Id),
      constraint FK_Product_Category foreign key (CategoryId) references Category (Id),
      constraint FK_Product_Vendor foreign key (VendorId) references Vendor (Id)
    )
    
    create index IX_Product_CategoryId on Product (CategoryId);

    I had to rename a table as part of a new feature. I could have just renamed the table and moved on, but I wanted all the constraints and indexes renamed as well, to match the naming convention. I could not find any easy way to do this and decided to script it.

    If you know of a tool that can do this, let know in the comments and stop reading any further 😄.

    Since I have been playing around with F# for a while, I chose to write it in that. SQL Server Management Objects (SMO) provides a collection of objects to manage SQL Server programmatically, and it can be used from F# as well. Using the #I and #r directives, the SMO library path and DLLs can be referenced.

    #I @"C:\Program Files\Microsoft SQL Server\140\SDK\Assemblies\";;
    #I @"C:\Program Files (x86)\Microsoft SQL Server\140\SDK\Assemblies";;
    #r "Microsoft.SqlServer.Smo.dll";;
    #r "Microsoft.SqlServer.ConnectionInfo.dll";;
    #r "Microsoft.SqlServer.Management.Sdk.Sfc.dll";;

    The SMO object model is a hierarchy of objects with the Server as the top-level object. Given a server name, we can start navigating through the entire structure and interact with the related objects. Below is how we can narrow down to the table that we want to rename.

    let generateRenameScripts (serverName:string) (databaseName:string) (oldTableName:string) newTableName = 
        let server = Server(serverName)
        let db = server.Databases.[databaseName]
        let oldTable = db.Tables |> Seq.cast |> Seq.tryFind (fun (t:Table) -> t.Name = oldTableName)

    SMO does allow generating scripts programmatically, very similar to how SSMS allows you to right-click on a table and generate the relevant scripts. The ScriptingOptions class allows passing in various parameters that determine the scripts generated. Below is how I create the drop and create scripts.

    let generateScripts scriptingOptions (table:Table) =
        let indexes = table.Indexes |> Seq.cast |> Seq.collect (fun (index:Index) -> (index.Script scriptingOptions |> Seq.cast<string>))
        let fks = table.ForeignKeys |> Seq.cast |> Seq.collect (fun (fk:ForeignKey) -> fk.Script scriptingOptions |> Seq.cast<string>)
        let all = Seq.concat [fks; indexes]
        Seq.toList all

    let generateDropScripts (table:Table) =
        let scriptingOptions = ScriptingOptions(ScriptDrops = true, DriAll = true, DriAllKeys = true, DriPrimaryKey = true, SchemaQualify = false)
        generateScripts scriptingOptions table

    let generateCreateScripts (table:Table) =
        let scriptingOptions = ScriptingOptions(DriAll = true, DriAllKeys = true, DriPrimaryKey = true, SchemaQualify = false)
        generateScripts scriptingOptions table

    For the create scripts, I do a string replace of the old table name with the new table name. The full gist is available here.
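
    Stitched together, the rename script is essentially the drop scripts, an sp_rename, and the create scripts with the old name replaced by the new one. Below is an illustrative sketch of that step (not the exact code from the gist), reusing the generateDropScripts and generateCreateScripts functions above.

    let buildRenameScript (oldName: string) (newName: string) (table: Table) =
        // Drop the existing constraints and indexes first
        let drops = generateDropScripts table
        // Rename the table itself
        let rename = sprintf "EXEC sp_rename '%s', '%s'" oldName newName
        // Recreate the constraints and indexes, replacing the old table name
        // in the generated scripts with the new one
        let creates =
            generateCreateScripts table
            |> List.map (fun (s: string) -> s.Replace(oldName, newName))
        drops @ [ rename ] @ creates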

    Below is what the script generated for renaming the above table from ‘Product’ to ‘ProductRenamed’. This output can be optimized further by passing the appropriate parameters to the ScriptingOptions class.

    let script = generateRenameScripts "(localdb)\\MSSQLLocalDB" "Warehouse" "Product" "ProductRenamed"
    File.WriteAllLines (@"C:\Work\Scripts\test.sql", script) |> ignore
    ALTER TABLE [Product] DROP CONSTRAINT [FK_Product_Category]
    ALTER TABLE [Product] DROP CONSTRAINT [FK_Product_Vendor]
    DROP INDEX [IX_Product_CategoryId] ON [Product]
    ALTER TABLE [Product] DROP CONSTRAINT [PK_Product] WITH ( ONLINE = OFF )
    ALTER TABLE [Product] DROP CONSTRAINT [UQ__Product__3214EC065B6D1E82]
    EXEC sp_rename 'Product', 'ProductRenamed'
    ALTER TABLE [ProductRenamed]  WITH CHECK ADD  CONSTRAINT [FK_ProductRenamed_Category] FOREIGN KEY([CategoryId])
    REFERENCES [Category] ([Id])
    ALTER TABLE [ProductRenamed] CHECK CONSTRAINT [FK_ProductRenamed_Category]
    ALTER TABLE [ProductRenamed]  WITH CHECK ADD  CONSTRAINT [FK_ProductRenamed_Vendor] FOREIGN KEY([VendorId])
    REFERENCES [Vendor] ([Id])
    ALTER TABLE [ProductRenamed] CHECK CONSTRAINT [FK_ProductRenamed_Vendor]
    CREATE NONCLUSTERED INDEX [IX_ProductRenamed_CategoryId] ON [ProductRenamed]
    (
      [CategoryId] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, DROP_EXISTING = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ALTER TABLE [ProductRenamed] ADD  CONSTRAINT [PK_ProductRenamed] PRIMARY KEY CLUSTERED 
    (
      [Id] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ALTER TABLE [ProductRenamed] ADD UNIQUE NONCLUSTERED 
    (
      [Id] ASC
    )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]

    One thing that is missing at the moment is renaming foreign key references from other tables in the database to this newly renamed table. The F# code is possibly not at its best, and I still have a lot of influence from C#. If you have any suggestions for making it better, sound off in the comments.

    Hope this helps and makes it easy to rename a table and update all associated naming conventions.