When I started this blog around nine years back, my only intention was to share technical posts. But over time I started writing about a variety of things, including productivity tips that I found useful, travelogues, random thoughts, personal goals, blogging, etc. One thing I have noticed is that many people have been inspired by the posts and photos I put up online, and it has triggered them to do similar things.

Share and Inspire, Image Source https://www.saylor.org/2015/04/blog-saylor-student-stories/

I’ve had my own inspirations to start the various things that I do today. For instance, I started running after being inspired by my friends Satish, Suresh, and Thiru, and I reached out to them for tips when I started running a year ago. From running I moved on to cycling and a bit of swimming after seeing my friend Rahul. For travel, my inspiration has been Arun Sudheendran and Deepak Suresh, who do a fair bit of exploration; I tend to reach out to them for travel ideas and places to visit. Similarly, there have been inspirations from people that I have never met, or met just once or twice.

Below is a transcript of a chat with one of my readers whom I have never met. It’s a great feeling to wake up to such messages and it boosts your own motivation to continue what you are doing.

Share and Inspire

Social media plays a great role in spreading information these days. When you see people in your own circle of friends start doing things that you have always wanted to, it gives you an extra push to give it a try. There might be some people who feel you are sharing too many things that don’t interest them. For those, there is always the option to unfollow, mute, filter, etc. Don’t let that thought stop you from sharing the things that you do.

Such a small act of sharing, even of things you might have seen someone else do, can add up and have a big impact on someone else, something often referred to as the Butterfly Effect.

The Butterfly Effect: This effect grants the power to cause a hurricane in China to a butterfly flapping its wings in New Mexico. It may take a very long time, but the connection is real. If the butterfly had not flapped its wings at just the right point in space/time, the hurricane would not have happened. - Chaos Theory

Share things that you do, share positive things and inspire others!

Of late I have been working for multiple clients at the same time. Different clients have different development environments, which has forced me into using Virtual Machines (VMs) for my day-to-day work. I will cover my actual setup and new way of working with VMs in a different post.

When working in VMs I often have to switch to the host machine for email, chat, and a few other programs that I have only on my host machine. Minimizing the VM window is time-consuming and context-breaking if you are working off a single screen. On a multi-monitor setup you can always have the VM on one screen and the host on the other, but this still gets tricky if you have more than one VM running.

The Virtual Desktops feature in Windows 10 is of great help in this scenario. You can move between desktops using keyboard shortcuts (Ctrl + Win + Right/Left Arrow). But with the VM running on a separate Virtual Desktop, any key press gets picked up by the VM operating system and not by the host. This means that you cannot use the keyboard shortcuts to switch host desktops from inside a VM. However, you can move between desktops using the four-finger swipe gesture on your touchpad (if that is supported). These swipe gestures are picked up only by the host OS, unlike the keyboard shortcuts. So even when you are inside a VM, doing the four-finger swipe tells the host OS to switch desktops. This allows you to easily navigate between VMs running on different Virtual Desktops.

Hope this helps!

A while back I had written about various one-day trip options around Sydney. Here is a list of places around Sydney that we traveled to during long weekend breaks, with a day or two of overnight stays.

TLDR;

Coffs Harbour

Coffs Harbour is the place I liked the most of all my trips. It’s been almost a year since I made the trip, and the memories are still fresh. The beaches are great, especially Jetty Beach. The rainforest walk in Dorrigo was the best I have had to date, especially because of the rain the night before. Coffs Harbour is perfect for a 3-4 day trip, and there are a lot of places to visit around it.

Jetty Beach, Coffs Harbour

Port Macquarie

We headed off to Port Macquarie to celebrate Gautham’s birthday. Gautham likes strawberries a lot, which is why we chose Port Macquarie. Ricardoes Tomatoes & Strawberries is located just ten minutes north of Port Macquarie and provides a unique pick-your-own-strawberries experience. You can spend around 2-3 hours here; make sure you don’t miss the scones from the cafe. Port Macquarie is also a great place for whale watching, and we headed off on an early morning trip to be with the whales. The boat ride (PortJet) is an experience in itself, and to our luck we were able to see around three whales up close. We also went to Dooragan National Park, Kattang, Perpendicular Point, and the Charles Hamsey lookout.

Whale watching and Strawberry picking, Port Macquarie

Grand Pacific Drive

The Grand Pacific Drive makes a great one-day trip, as well as a multi-day trip for those who want to take their time along this stretch of land. Starting from the Royal National Park and stretching all the way to the Sapphire Coast, it makes a great drive with beautiful scenery and a lot of places to visit. The Grand Pacific Drive site has all the details you need to plan your trip, including a trip planner that makes planning easier. If you want to cover most of the places along the way in a single trip, it is best to give it 2-3 days. During my trip, I stopped over at Wollongong and only made it as far as Kiama.

Grand Pacific Drive

Blue Mountains

Just 90 minutes from Sydney by car, the Blue Mountains has a lot of attractions worth visiting, making it a good place for an extended weekend trip. Wentworth Falls, Echo Point, and the Three Sisters are some of the popular lookouts. Scenic World offers some good rides and entertainment for kids; I liked the world’s steepest incline railway ride in particular, though the entry tickets are a bit overpriced.

Three Sisters, Blue Mountains

Jenolan Caves is another one-hour drive from the Blue Mountains and is a must-do. It’s great for people of all ages, and if you have kids they will love it. Make sure you check the different cave options and choose one that fits the people in your group. Booking a spot in advance might help, and make sure you arrive on time: the drive up there can be a bit slow, so give yourself enough buffer before your cave walk starts.

Canberra

Unlike Sydney, Canberra is a planned city, and you can tell that from the moment you enter it. It’s a beautiful little city with a wide variety of things to visit. We started off with the Cockington Green Gardens, followed by the National Dinosaur Museum. You can spend almost half a day with these, and try out The Hamlet food trucks. The Parliament House and the Australian War Memorial are also worth visiting. If you time your visit during September-October, you can also see Floriade, the tulip flower festival.

Floriade Tulip Festival, Canberra

Nelson Bay, Hunter Valley, Orange, Port Stephens, etc. are some of the places that were on our list but that we could not make it to yet. I moved over to Brisbane at the end of last year and am not sure when I will have another chance to explore more around Sydney. But I have new places to look forward to now - exploring Brisbane!

I was given a console application written in .NET Core 2.0 and asked to set up a continuous deployment pipeline using TeamCity and Octopus Deploy. I struggled a bit with some parts, so I thought it’s worth putting together a post on how I went about it. If you have a better or different way of doing things, please shout out in the comments below.

At the end of this post, we will have a console application that is automatically deployed to a server and running, anytime a change is pushed to the associated source control repository.

Setting Up TeamCity

Create a New Project and add a new build configuration just like you would for any other project. Since the application is in .NET Core, install the .NET CLI plugin on the TeamCity server.

Build Steps to build .NET Core

The first three build steps use the .NET CLI to Restore, Build, and Publish the application. These steps restore the project’s dependencies, build it, and publish all the relevant DLLs into the publish folder.
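
On the command line, the three steps are roughly equivalent to the below (the published-app output folder name is just what I pass to the pack step later; adjust to your setup):

dotnet restore
dotnet build
dotnet publish -c Release -o published-app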

The published application now needs to be packaged for deployment. In my case, deployments are managed using Octopus Deploy. For .NET projects, the preferred way of packaging for Octopus is OctoPack. However, OctoPack does not support .NET Core projects; the recommendation is to use either dotnet pack or Octo.exe pack. Using the latter, I have set up a Command Line build step to pack the contents of the published folder into a zip (.nupkg) file.

octo pack --id ApplicationName --version %build.number% --basePath published-app

The NuGet package is published to the NuGet server used by Octopus. Using the Octopus Deploy: Create Release build step, a new release is triggered in Octopus Deploy.
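
If you prefer the command line over the built-in build steps, the push and release creation can also be scripted with octo; a sketch, where the server URL and API key parameters are placeholders:

octo push --package ApplicationName.%build.number%.nupkg --server https://your-octopus-server --apiKey %octopus.apikey%
octo create-release --project ApplicationName --version %build.number% --server https://your-octopus-server --apiKey %octopus.apikey%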

Setting Up Octopus Deploy

Create a new project in Octopus Deploy to manage deployments. Under the Process tab, I have two steps - one to deploy the Package and another to start the application.

Octopus Deploy Process Steps

For the Deploy Package step I have enabled Custom Deployment Scripts and JSON Configuration variables. Under the pre-deployment script, I stop any existing .NET applications. If multiple .NET applications are running on the box, select your application explicitly, as in the sketch after the script below.

Pre Deployment Script
Stop-Process -Name dotnet -Force -ErrorAction SilentlyContinue
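
A sketch of stopping only your application’s process by matching its command line (the DLL name below is a placeholder):

Get-CimInstance Win32_Process -Filter "Name = 'dotnet.exe'" |
    Where-Object { $_.CommandLine -like '*ApplicationName.dll*' } |
    ForEach-Object { Stop-Process -Id $_.ProcessId -Force }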

Once the package is deployed, the custom script starts up the application.

Run App
cd C:\DeploymentFolder
Start-Process dotnet .\ApplicationName.dll

With all that set up, any time a change is pushed to the source control repository, TeamCity picks it up, builds it, and triggers a deployment to the configured environments in Octopus Deploy. Hope this helps!

Often when working with SQL queries, I come across the need to capitalize SQL keywords across a large query - e.g., the SELECT, WHERE, and FROM keywords. With a large query or stored procedure, this is faster done using a text editor. Sublime Text is my preferred editor for this kind of text manipulation.

Sublime Text comes with a few built-in text casing converters that we can use to convert text from one case to another. Using the simultaneous editing feature, we can combine it with case conversion and manipulate large documents easily.

Convert Case options in Sublime Text

For example, let’s say I have the below SQL query. As you can notice, the SELECT and FROM keywords are cased inconsistently across the query.

select * From Table1
select * From Table2
select * From Table3
SELECT * FROM Table4
Select * From Table5

To standardize this (preferably capitalizing everything), highlight one of the ‘select’ keywords and select all occurrences of it (Alt + F3). Once all occurrences of ‘select’ are highlighted, bring up the command palette (Ctrl + Shift + P on Windows) and search for ‘Convert Case’. From the options listed, choose the case that you want. All selected occurrences of the keyword will now be in the chosen case.
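
Repeating the same steps for the ‘from’ keyword leaves the query consistently cased:

SELECT * FROM Table1
SELECT * FROM Table2
SELECT * FROM Table3
SELECT * FROM Table4
SELECT * FROM Table5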

Hope this helps you when you have a lot of text case manipulations to be done.

Posts per month - 2017

A year has gone by so fast, and it is again time to do a year review.

TLDR;

2017 was a transformation year. Regular exercise and healthy eating helped me lose around 20 kilos. Lots of travel and blogging made it an excellent year. Reading, photography, and open source did not go that great. Looking forward to 2018!

What went well

Blogging

It has been both a good and a bad year for this blog. Including the ‘Tip of the Week’ series, I wrote seventy-six posts this year, an average of over six posts per month. That is the good part, as it is well past the minimum of four posts a month set as a goal last year. But looking at the actual posts-per-month graph below, it is clear that I fell short on a month-by-month basis. Up until August I had a steady stream of posts coming in, after which it started dropping, with one month (November) having no posts at all. Mainly it’s my laziness to blame, but I can also point to reasons like the Vietnam trip, shifting to Brisbane, etc.

Posts per month in the year 2017

Running

I had started running towards the end of December 2016. Running was one of my goals for 2017, and it has seen good improvement: I ran over 750 kilometers, including a half marathon. I am yet to participate in any running events and am planning to in the coming year. I have also started cycling, which is an excellent way to cross-train.

Year in Sport

Travel

We did our first international holiday, ten days in Vietnam, and it was a great experience. We also went around Australia visiting the Blue Mountains, Canberra, Port Macquarie, and Brisbane, plus lots of one-day trips around Sydney. Mandarin picking, strawberry picking, and whale watching were some of the top activities of the year.

Strawberry Picking, Ricardoes

What didn’t go well

  • Reading Had set out with a goal of 21 books but ended up finishing only ten. Of the books I read, Mindset and How to Win Friends and Influence People were the best.

  • Photography One trip every three months with photos posted was another goal. The travel part went well (see above), but my DSLR mostly remained in the bag. Except for a few pictures on the phone camera, there was not much photography done.

  • FSharp F# was again on and off this year. Apart from a small utility that I created for Todoist, I did not do much F# work.

Goals for 2018

  • Blogging Stick to 4 posts a month. Need to get back on schedule.

  • Running Attend a few running events. Run a marathon.

  • Swimming Having started cycling along with running has got me thinking about a triathlon. The only thing missing is swimming, and I have no clue how to swim. Learning to swim is one of the key goals for the upcoming year. The target is to be able to swim one kilometer.

  • Open Source Start working on a side project. Need to find a matching project first.

  • Reading Read 15 books.

Wishing you all a Happy and Prosperous New Year!

I recently upgraded to a Garmin Fenix 3 HR from my Forerunner 630. After a few runs with the Fenix 3, I realized that in Training Mode it does not do auto lap. I have a custom training workout for a 10k with no repeat modes in it. This workout was what I used on my FR630, and it used to auto lap at 1km. That no longer happens in the Fenix 3 HR.

Software Details

Fenix 3 HR: 4.70

Forerunner 630: 7.50(bdd586f)

Garmin Auto Lap not working in Training Mode, Fenix 3 HR

After googling around, I understood that auto lap in training mode is a feature available only on specific models/software versions. One reasoning behind this is that auto lap might create issues for people training in intervals larger than 1 km: breaking into laps at every 1 km would make it harder, nearly impossible, to compare their intervals. For workouts where you want auto lap at 1 km (or at any custom distance), you can use the Repeat feature as shown below. Setting up the workout as 10 x 1 km helps to analyze the run at 1 km intervals.

Garmin Auto Lap Using Repeat, Fenix 3 HR

Depending on the model/software version of your Garmin watch, you might have to tweak your workout plans. Hope this helps!

Scheduling

One of my clients had a requirement to schedule various rules that send out alert messages via SMS, email, etc. A Rule consists of the below (and a few other) properties:

  • Stored Procedure: The stored procedure (yes, you read that correctly) to check whether an alert needs to be raised.
  • Polling Interval: The time interval at which a Rule needs to be checked.
  • Cool-Off Period: The time to wait before running a Rule again after an alert was raised.

All Rules are stored in a database. New rules can be added and existing ones updated via an external application. Since the client is not yet in the cloud, using Azure Functions, Lambda, Web Jobs, etc. is out of the question. It needs to be a service running on-premise, so I decided to make it a Windows Service.

public class Rule
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string StoredProc { get; set; }
    public TimeSpan PollingInterval { get; set; }
    public TimeSpan CoolOffPeriod { get; set; }
    ...
}

Because of my past good experiences with HangFire, I initially set off using it, only to soon discover that it can schedule jobs only down to the minute. Even though this is a feature that has been discussed for a long time, it is yet to be implemented. Since some of the rules are critical to the business, the client wants to be notified as soon as possible, which means having a polling interval in seconds for those rules.

After reaching out to my friends at Readify, I decided to use Quartz.NET. Many had good experiences with it in the past and recommended it highly. Another option that came up was FluentScheduler; there was no particular reason for picking Quartz.NET over it.

Quartz.NET is a full-featured, open source job scheduling system that can be used from smallest apps to large-scale enterprise systems.

Setting up and getting started with the Quartz scheduler is fast and easy, and the library has well-written documentation. You can update the application’s configuration file to tweak various attributes of the scheduler.

App/Web.config file
<configuration>
  <configSections>
    <section name="quartz" type="System.Configuration.NameValueSectionHandler, System, Version=1.0.5000.0,Culture=neutral, PublicKeyToken=b77a5c561934e089" />
  </configSections>
  <quartz>
    <add key="quartz.scheduler.instanceName" value="TestScheduler" />
    <add key="quartz.jobStore.type" value="Quartz.Simpl.RAMJobStore, Quartz" />
  </quartz>
</configuration>

The RAMJobStore indicates the store to use for persisting jobs; being in-memory, it loses all jobs when the application restarts. Other job stores are available if you want jobs to survive application restarts, as in the sketch below.
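
For example, persisting jobs to SQL Server needs configuration along these lines (a sketch based on the Quartz.NET documentation; the connection string is a placeholder):

<quartz>
  <add key="quartz.scheduler.instanceName" value="TestScheduler" />
  <add key="quartz.jobStore.type" value="Quartz.Impl.AdoJobStore.JobStoreTX, Quartz" />
  <add key="quartz.jobStore.driverDelegateType" value="Quartz.Impl.AdoJobStore.SqlServerDelegate, Quartz" />
  <add key="quartz.jobStore.dataSource" value="default" />
  <add key="quartz.dataSource.default.connectionString" value="Server=localhost;Database=Quartz;Integrated Security=True" />
  <add key="quartz.dataSource.default.provider" value="SqlServer-20" />
</quartz>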

Setting Up Jobs

There are three jobs set up for the whole application: the Alert Job, the Cool-Off Job, and the Refresh Job. The Alert and Refresh Jobs are scheduled on application start; the Cool-Off Job is triggered by the Alert Job as required. Any data that a job requires is passed in using the JobDataMap; a sketch of the helpers I use for that follows.
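
The GetJobKey, SetJobData, and GetRuleFromJobData helpers used in the snippets below are small extension methods not shown in full in this post; a rough sketch of what they could look like, assuming the Rule is carried as JSON (via Json.NET) inside the JobDataMap:

// Sketch of the helper extensions used in this post. Serializing the Rule
// as JSON into the JobDataMap is an assumption; any string or primitive
// representation works with Quartz's data map. Similar helpers exist for
// the trigger keys.
public static class SchedulingExtensions
{
    private const string RuleKey = "Rule";

    public static JobKey GetJobKey(this Rule rule)
    {
        // Key the job off the Rule Id so re-scheduling replaces the same job.
        return new JobKey("AlertJob-" + rule.Id);
    }

    public static JobBuilder SetJobData(this JobBuilder builder, Rule rule)
    {
        var map = new JobDataMap();
        map.Put(RuleKey, JsonConvert.SerializeObject(rule));
        return builder.SetJobData(map);
    }

    public static Rule GetRuleFromJobData(this IJobExecutionContext context)
    {
        var json = context.JobDetail.JobDataMap.GetString(RuleKey);
        return JsonConvert.DeserializeObject<Rule>(json);
    }
}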

Schedule an Alert Job
...
var job = JobBuilder.Create<AlertJob>()
    .WithIdentity(rule.GetJobKey())
    .WithDescription(rule.Name)
    .SetJobData(rule)
    .Build();

var trigger = TriggerBuilder
    .Create()
    .WithIdentity(rule.GetTriggerKey())
    .StartNow()
    .WithSimpleSchedule(a => a
        .WithIntervalInSeconds((int)rule.PollingInterval.TotalSeconds)
        .RepeatForever())
    .Build();

scheduler.ScheduleJob(job, trigger);

Alert Job

The Alert Job is responsible for checking the stored procedure and sending the alerts if required. If an alert is sent, it starts the Cool-Off Job and pauses the current job instance. The DisallowConcurrentExecution attribute ensures that multiple instances of the Job with the same key do not execute concurrently. We explicitly set the Job Key based on the Rule Id. This prevents duplicate messages from being sent out if a job instance takes more time to execute than its polling interval.

[DisallowConcurrentExecution]
public class AlertJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        var alert = context.GetRuleFromJobData();
        var message = GetAlertMessage(alert);
        if(message != null)
        {
            SendMessage(message);
            CoolOff(alert);
        }
    }

    public void CoolOff(Rule rule)
    {
        var job = JobBuilder.Create<CoolOffJob>()
            .WithIdentity(rule.GetCoolOffJobKey()) // assumed helper, analogous to GetCoolOffTriggerKey()
            .WithDescription(rule.MessageTitle)
            .SetJobData(rule)
            .Build();

        var trigger = TriggerBuilder
            .Create()
            .WithIdentity(rule.GetCoolOffTriggerKey())
            .StartAt(rule.GetCoolOffDateTimeOffset())
            .Build();

        scheduler.PauseJob(rule.GetJobKey());
        scheduler.ScheduleJob(job, trigger);
    }
    ...
}

Cool-Off Job

The Cool-Off Job is a one-time job scheduled by the Alert Job after an alert is sent successfully. It is scheduled to start after the cool-off time configured for the alert, which triggers the job only after the set amount of time. It then resumes the original Rule Job to continue execution.

public class CoolOffJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        var alert = context.GetRuleFromJobData();
        ScheduleHelper.ResumeJob(alert);
    }
}

Refresh Job

The Refresh Job is a recurring job that polls the database for any changes to the Rules themselves. If any change is detected, it removes the existing schedule for the alert and adds the updated alert job; a sketch of what that could look like follows the snippet below.

[DisallowConcurrentExecution]
public class RefreshJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        var allRules = GetAllRules();
        ScheduleHelper.RefreshRules(allRules);
    }
}
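
ScheduleHelper.RefreshRules is not shown here, but conceptually it does something like the below for every changed rule (a sketch; scheduler is the shared IScheduler instance, and HasChanged and ScheduleAlertJob are hypothetical helpers):

// Sketch: drop and re-add the schedule for every rule that changed.
public static void RefreshRules(IEnumerable<Rule> allRules)
{
    foreach (var rule in allRules.Where(HasChanged))
    {
        // DeleteJob removes the job along with all of its triggers.
        scheduler.DeleteJob(rule.GetJobKey());
        // Re-schedule the rule with its updated stored proc and intervals.
        ScheduleAlertJob(rule);
    }
}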

With these three jobs, all the rules get scheduled at the start of the application and run continuously. Anytime a change is made to the rule itself, the Refresh Job refreshes it within the time interval that it is scheduled for.

Tip: If there are a lot of rules with the same polling interval, it is good to stagger their starting times using a delayed start per job instance, as in the sketch below. Doing that makes sure that all the jobs do not fire at the same time.
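
A possible way to do that (a sketch; the ruleIndex variable and the five-second spacing are assumptions) is to delay each trigger’s first fire time:

// Stagger first fire times so rules with the same interval do not all fire together.
var trigger = TriggerBuilder
    .Create()
    .WithIdentity(rule.GetTriggerKey())
    .StartAt(DateTimeOffset.UtcNow.AddSeconds(ruleIndex * 5)) // instead of StartNow()
    .WithSimpleSchedule(a => a
        .WithIntervalInSeconds((int)rule.PollingInterval.TotalSeconds)
        .RepeatForever())
    .Build();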

So far I have found the Quartz library stable and reliable and have not faced any issues with it. The library is also quite flexible and adapts well to different needs.

Hope this helps. Merry Xmas!

I was recently playing around with the MessageMedia API, trying to send an SMS and get the status of the sent SMS. Sending the SMS and getting the status of the last sent SMS always happened in succession when testing manually: once I sent the message, I waited for the API response, grabbed the message id from the response, and used that to form the get-status request.

Postman is a useful tool if you are building or testing APIs. It allows you to create, send, and manage API requests.

Postman Chaining Requests

I added two requests and saved them to a collection in Postman - one to send a message and the other to get the message status. I created an environment variable to hold the message id. For the request that sends a message, the below Test snippet is added. It parses the response body of the request and extracts the message id of the last sent message, which is then saved to the environment variable. The Test snippet always runs after the request is performed.

var jsonData = JSON.parse(responseBody);
postman.setEnvironmentVariable("messageId", jsonData.messages[0].message_id);
tests["Success"]= true;

The Get message request uses the messageId from the environment variables to construct its URL. The URL looks like below.

https://api.messagemedia.com/v1/messages/{{messageId}}

When executing this request, Postman fetches the messageId from the environment variable, which was set by the previous request. You no longer have to copy the message id manually and use it in the URL. This is how we chain data from one request to another. Chaining requests is also useful in automated testing using Postman; for example, the get-status request can carry its own Test snippet with assertions.
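
A sketch of such a snippet (the presence and name of the status field depend on the MessageMedia response format, so treat that assertion as an assumption):

var jsonData = JSON.parse(responseBody);
tests["Status code is 200"] = responseCode.code === 200;
tests["Response has a status field"] = jsonData.status !== undefined;

Hope this helps!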

At times you might need to extract data from a large piece of text. Let’s say you have a JSON response and want to extract all the id fields in it and combine them into a comma-separated list. Here’s how you can easily do that using Sublime Text (or any other editor that supports simultaneous editing).

https://jsonplaceholder.typicode.com/posts
[
  {
    "userId": 1,
    "id": 1,
    "title": "sunt aut facere repellat provident occaecati excepturi optio reprehenderit",
    "body": "quia et suscipit\nsuscipit recusandae consequuntur expedita et cum\nreprehenderit molestiae ut ut quas totam\nnostrum rerum est autem sunt rem eveniet architecto"
  },
  {
    "userId": 1,
    "id": 2,
    "title": "qui est esse",
    "body": "est rerum tempore vitae\nsequi sint nihil reprehenderit dolor beatae ea dolores neque\nfugiat blanditiis voluptate porro vel nihil molestiae ut reiciendis\nqui aperiam non debitis possimus qui neque nisi nulla"
  },
  ...
]

Again, the key here is to select the recurring pattern first - in this case “id”: - and then select all occurrences of it. Once all occurrences are selected, we can expand the selection to the whole line and extract those lines out. Repeat the same steps to remove the id text, then follow the same steps we used earlier to combine the text.

Hope this helps you to extract data from large text files.