Code formatting is an essential aspect of writing code, and I wrote a while back about introducing code formatting into a large code base. It’s not about which rules you and your team use; it’s about picking conventions and sticking with them consistently. Formatting rules work best when applied automatically, so that the developer does not need to do anything in particular about it.
Prettier is an opinionated code formatter that supports multiple languages and editors and is easy to get started with. Setting up is as easy as installing the prettier package using yarn/npm. There are multiple points at which you can integrate Prettier - in your editor, in a pre-commit hook or in your CI environment.
Most IDEs have plugins for Prettier, which makes it easy to format code right from the beginning. You might need to update your IDE settings to run Prettier when you save a file. For VS Code, I set editor.formatOnSave to true to turn on this behaviour.
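For reference, a minimal VS Code settings.json for this could look like the following, assuming the commonly used esbenp.prettier-vscode extension is installed:

```json
{
  // Run the formatter on every save
  "editor.formatOnSave": true,
  // Make Prettier the formatter that runs
  "editor.defaultFormatter": "esbenp.prettier-vscode"
}
```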
As the title says, Prettier is opinionated, which is useful in many ways and avoids much of the time otherwise wasted on unnecessary discussions. However, it does provide some configuration options. Check if they are enough for you to call a meeting to decide on one of them :).
Write prettier code!
With more and more data breaches happening, it is possible that your personal information and passwords are already compromised. If you have been lazily reusing passwords across multiple sites (just like me until a while back), it is worth checking whether your password is already compromised. It does not have to be one of your social media or bank accounts that gets breached for an attacker to get your credentials. If you reuse passwords across sites, it might be that one site where security is not given much importance that gets breached, exposing your credentials to the attacker or anyone who has the breached data. Attackers often take this information to other sites, social networks and bank logins and try to log in, counting on exactly this habit of password reuse.
To check if you have been part of a data breach, you can use the service haveibeenpwned. If you have been part of any data breaches, it will show you the details. In addition, you can use the Pwned Passwords list to check if a password that you use has been part of any data breach; it’s good to change your password if you find yours in there. If you are worried about entering your password on the haveibeenpwned site, the good thing is that it uses a k-anonymity model, which means that your full password is never sent across the wire.
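To make the k-anonymity model concrete, here is a small Python sketch: only the first five characters of the password’s SHA-1 hash are sent to the Pwned Passwords range API, and the comparison against the returned suffixes happens locally (the function name is mine; the endpoint is the real API URL):

```python
import hashlib

def pwned_range_query(password):
    """Split a password's SHA-1 hash into the 5-char prefix that goes to
    the Pwned Passwords API and the suffix that never leaves your machine."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # Only `prefix` is sent over the wire, e.g. to
    # https://api.pwnedpasswords.com/range/{prefix}. The API returns every
    # breached hash suffix starting with that prefix, and the match against
    # `suffix` is done locally.
    return prefix, suffix

prefix, suffix = pwned_range_query("password")
print(prefix)  # 5BAA6
```

Because hundreds of real passwords share any given 5-character prefix, the service cannot tell which one you were checking.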
- Update your passwords on all sites that you use if you have been reusing passwords. If you don’t have much time to do this in one shot, you can do this incrementally as and when you next visit them.
- Make sure you have a unique password for each site. A good password is one that you cannot remember, so if you are not using a Password Manager, it’s a good idea to start using one. If you don’t want to spend money on a password manager, you can always use a random password generator to generate one for you. Remembering that password might be hard; you could either write it down or save it in the browser (not that I recommend this over getting a Password Manager, but it is better than reusing passwords).
While in Sydney, I was lucky enough to attend the first and second NDC Conferences. After moving up to Brisbane, I did not think I could attend one of these again soon. But then came a nice, shorter version of NDC specific to security - NDC Security. As the name suggests, this conference is particular to security-related topics, with a 2-day workshop and a 1-day conference, and was held on the Gold Coast, Queensland.
Troy Hunt and Scott Helme ran two workshops, and I attended Hack Yourself First by Troy. The workshop covers a wide range of topics and is perfect for anyone who is into web development. The best thing is that you only need a browser and Fiddler/Charles Proxy (depending on whether you are in Windows or Mac land). One of the interesting things about the workshop is that it first puts you in the hacker’s perspective and has you exploit existing vulnerabilities in a sample site designed specifically for this. Once you can do that, you then look at ways of protecting against such exploits and the mechanisms involved.
The workshop highlights how easy it is to find and exploit vulnerabilities in applications. Some tools detect vulnerabilities and exploit them for you once you feed them a few details; you do not necessarily need to know the vulnerability itself or exactly how to exploit it. Such tools make it easy for anyone to attack any website out there on the web, and combined with the power of search engines, they make vulnerable sites easy to discover.
There were six talks in total and below are the ones that I found interesting.
- CSP XXP STS PKP CAA ETC OMG WTF BBQ… (Scott Helme)
- Dependable Dependencies
- Everything is Cyber-broken (Troy Hunt and Scott Helme)
The whole web is on a journey towards becoming more secure, so it is an excellent time to move to HTTPS if you have not already. Even after enabling HTTPS, it is a good idea to make sure you have all the appropriate security headers set. Making sure that the libraries you depend on are patched and updated is equally essential; there have been massive data breaches caused by vulnerabilities in third-party libraries that were not kept updated.
Functionality need not be the only reason to upgrade third-party libraries. Security vulnerabilities getting patched is an equally good reason to update dependent packages.
The harder part is keeping track of the vulnerabilities being reported and continuously checking them against your application’s dependencies. There is a wide range of tools that make this easy and integrate seamlessly into the development workflow. They can kick in as early as when a developer first pulls a library into the source code, in the build pipeline, or even against sites that are already up and running. The earlier such issues are detected in the software development lifecycle, the less they cost to fix.
The conference ended with a good discussion between Troy and Scott on how everything is cyber-broken. It touched upon the value of Extended Validation (EV) certificates and how CAs are trying to push them while browsers are increasingly moving away from them. It also covered the various proponents of plain HTTP and the wrong messages they spread to a broader audience, certificate revocation, and a lot more. It was a fun discussion and a great end to the three-day event.
Location and Food
NDC Security was held at QT Gold Coast, Queensland, and was well organized. Coffee and drinks were available throughout the day, with a barista on the last day (which was cool). Food was served at the start, at breaks and at lunch, and was good. The conference rooms were great and spacious and had reasonably good internet; we did not face many connectivity issues, and everything ran smoothly.
One of the first things I did after coming back from the conference was to move this blog over to HTTPS. I had been procrastinating on this for a long time, but there were now enough reasons to make the move. There are also a bunch of things that now catch my eye at client places and on other websites that I visit often. Attending the conference and workshop has been a great value add, and I recommend it to anyone who gets a chance to attend. For everyone else, most of the content is available on Pluralsight.
PS: Special thanks to Readify for sending me to this conference and also providing a ‘paid vacation (accommodation)’ in Gold Coast. It was a nice three-day break for my wife and son also.
Two Factor Authentication (2FA) is becoming more and more common these days and is a good way to protect your accounts from getting into the wrong hands. SMS- and app-based 2FA are common with the day-to-day services that we use, like Gmail, Outlook, Facebook, etc. With 2FA enabled, the user is prompted at login for a number that is sent to them via phone or generated by an application, in addition to the username and password. This protects your account a level further: even if an attacker has your credentials from a data breach, they would still need access to your phone to log in to your account. Using an app to generate the codes is preferable to SMS, as it requires neither internet connectivity nor mobile service.
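As an aside on how these apps work: the codes are generated with the TOTP algorithm (RFC 6238), which applies HOTP (RFC 4226) to the current 30-second time step, which is why no connectivity is needed. A minimal Python sketch, using only the standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 of the counter, dynamically truncated."""
    msg = struct.pack(">Q", counter)  # counter as 8-byte big-endian
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks a 4-byte window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(base32_secret, period=30):
    """RFC 6238 TOTP: HOTP keyed by the current 30-second time step."""
    key = base64.b32decode(base32_secret, casefold=True)
    return hotp(key, int(time.time()) // period)

# RFC 4226 test vector: counter 0 with the ASCII secret below yields 755224
print(hotp(b"12345678901234567890", 0))  # 755224
```

Both your phone and the server derive the same code from the shared secret and the current time, which is why the codes work completely offline.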
Until recently, I had been using Google Authenticator to generate codes for all the accounts on which I have 2FA enabled. The app works well on a single mobile device but becomes a pain when you want to switch phones or you lose your phone. You could potentially be locked out of your accounts if you lose the phone and don’t have the backup codes available.
Authy is one of the best-rated 2FA applications and addresses exactly these issues with Google Authenticator. It is easy to set up, can be secured via TouchID/password, supports encrypted backups, and syncs across multiple applications and devices. Once set up, any account you add to the app gets synced through Authy’s servers, encrypted and secured. Authy has applications for mobile and desktop, and also a plugin for the Chrome browser. You can manage devices from your account and revoke a device if it gets lost or is no longer used. The Authy vs Google Authenticator post covers in detail the differences between the two and the advantages of using Authy.
Check out Authy and do set up 2FA if you haven’t already!
If you are here reading this, you probably have a website and are serving it over HTTP. If you are unsure whether your site needs HTTPS or not, don’t think twice - YES, YOUR SITE NEEDS HTTPS.
If you are not convinced, check out https://doesmysiteneedhttps.com/. One of the main reasons I have seen people (including me) shy away from HTTPS is cost, and this post explains how to get HTTPS for free. But make sure you are getting it for the correct reasons and you know exactly what you are getting.
HTTPS & SSL doesn't mean "trust this." It means "this is private." You may be having a private conversation with Satan.— Scott Hanselman (@shanselman) April 4, 2012
Depending on how you are hosting, there are two routes you can take to enable HTTPS on your site. Let’s look at them in detail.
Option 1 - Get your Certificate and Add to Your Host
If your hosting service already allows you to upload a certificate for a custom domain, but you were holding back because of the extra cost of getting one, head over to Let’s Encrypt to get your free certificate. Depending on your hosting provider and the level of access you have to your web server, Let’s Encrypt has multiple ways of getting you a certificate.
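For example, if you have shell access to the server, Certbot is the most commonly used Let’s Encrypt client. A typical invocation looks something like this (the domain is a placeholder, and the --nginx plugin flag depends on which web server you run):

```shell
# Obtain and install a certificate for an nginx-served site
sudo certbot --nginx -d example.com -d www.example.com

# Let's Encrypt certificates expire after 90 days; verify automated renewal
sudo certbot renew --dry-run
```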
What does it cost to use Let’s Encrypt? Is it really free?
We do not charge a fee for our certificates. Let’s Encrypt is a nonprofit, our mission is to create a more secure and privacy-respecting Web by promoting the widespread adoption of HTTPS. Our services are free and easy to use so that every website can deploy HTTPS.
We require support from generous sponsors, grantmakers, and individuals in order to provide our services for free across the globe. If you’re interested in supporting us please consider donating or becoming a sponsor.
In some cases, integrators (e.g. hosting providers) will charge a nominal fee that reflects the administrative and management costs they incur to provide Let’s Encrypt certificates.
Option 2 - CloudFlare
If, like me, you are on a shared/cheaper hosting service, it is likely that your hosting plan does not support adding SSL certificates. You would be forced to upgrade to a higher plan to upload a certificate, which in turn costs more. In this case, you can use Cloudflare to enable HTTPS for free.
Cloudflare provides lots of features for websites, but in our case we are most interested in what the Free plan gives us: a shared SSL certificate, with the added benefit of a global CDN.
Cloudflare acts as a reverse proxy between your visitors and the server hosting your web pages, which simply means that all requests now go through Cloudflare, which in turn reaches out to the web server only if it cannot find a locally cached copy. This also reduces the number of calls to your web server, as Cloudflare serves responses from its cache whenever they are available.
Shared SSL is the more interesting part for this blog post: it is what gives us free HTTPS for our website. We get a Domain Validated (DV) certificate, with a small catch - it is not issued to our domain but to a shared Cloudflare domain (sni154817.cloudflaressl.com in my case). If you want a custom SSL certificate, you need to be on a paid plan.
Cloudflare supports multiple SSL settings - Off, Flexible SSL, Full SSL and Full SSL (Strict). Depending on how your host is set up, you can choose one of these options. I host on Azure Web Apps, which supports HTTPS over the *.azurewebsites.net subdomain; since that certificate is not for my custom domain name (rahulpnath.com), I have set the SSL setting to Full SSL. Cloudflare in this case connects over HTTPS but does not validate the certificate. If your host does not support HTTPS connections (for free), you can use Flexible SSL.
You can also enable Cloudflare with Full SSL (Strict) if you have followed Option 1 and have a custom SSL certificate for the domain. This gives you the added benefits that Cloudflare provides on top of your own certificate.
Enabling HSTS Preload
Now that you have HTTPS set up on your domain with either of the options above, the website is accessible over HTTPS. However, the very first request to the website still goes over HTTP and is then redirected to HTTPS, after which communication happens over a secure channel. There is a risk that this very first request can be intercepted and cause undesired behaviour.
Trust on first use (TOFU), or trust upon first use (TUFU), is a security model used by client software which needs to establish a trust relationship with an unknown or not-yet-trusted endpoint.
By setting the STS (Strict-Transport-Security) header along with the preload directive, we can add our domain to the HSTS Preload list. By adding your domain to this list, it literally gets hardcoded into the source code of browsers (for example, Chrome’s list here). Any time a request is made to a site, it is checked against this hardcoded in-memory list, and if present, the request goes over HTTPS from the very first one. You can have all subdomains of your domain HSTS-preloaded as well; make sure all subdomains are served over HTTPS so that you do not lock yourself out of those sites. You can find more details on HSTS here.
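For reference, the header itself looks like this. If you manage your own web server you can set it there (nginx syntax shown as an example); on Cloudflare, HSTS can instead be enabled from the dashboard:

```nginx
# Serve every response with STS: stick to HTTPS for a year, cover all
# subdomains, and opt in to the browsers' preload list
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" always;
```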
Now that the cost factor is out of making your site support HTTPS, is there anything else holding you back? If speed is a concern and you worry that encryption/decryption at both ends of the communication is going to slow you down, take a look at this post on HTTPS’ massive speed advantage. If you are still not convinced, let me give it one last shot to get you on board. Going forward, most modern browsers are going to treat the web as secure by default, so instead of the present positive visual security indicators, they will start showing warnings on pages served over HTTP. That means your sites will soon be flagged as Not Secure if you do not move over to HTTPS.
I don’t see any reason why we should still be serving our sites over HTTP. As you can see, I have moved over to HTTPS and have added this domain to the preload list as well. Let’s make the web secure by default!
When I started this blog around nine years back, my only intention was to share technical posts. But over time, I started writing about a variety of things, including productivity tips that I found useful, travelogues, random thoughts, personal goals, blogging, etc. One of the things I have noticed is that many people have been inspired by these posts and the photos I share online, and it has prompted them to do similar things.
I’ve had similar inspirations for the various things I do today. For instance, I started running a year ago after being inspired by my friends Satish, Suresh and Thiru, and I reached out to them for tips when I started. From running, I moved on to cycling and a bit of swimming after seeing my friend Rahul. For travel, my inspirations have been Arun Sudheendran and Deepak Suresh, who do a fair bit of exploration; I tend to reach out to them for travel ideas and places to visit. Similarly, there have been inspirations from people I have never met, or met just once or twice.
Below is a transcript of a chat with one of my readers whom I have never met. It’s a great feeling to wake up to such messages and it boosts your own motivation to continue what you are doing.
Social media plays a great role in spreading information these days. When people in your own circle of friends start doing things that you have always wanted to do, it gives you an extra push to give them a try. There might be some people who feel you are sharing too many things that don’t interest them; for those, there is always the option to unfollow, mute, filter, etc. Don’t let that thought stop you from sharing the things you do.
Such a small act of sharing, even of things you might have seen someone else do, can add up and have a big impact on someone else - often referred to as the Butterfly Effect.
The Butterfly Effect: This effect grants the power to cause a hurricane in China to a butterfly flapping its wings in New Mexico. It may take a very long time, but the connection is real. If the butterfly had not flapped its wings at just the right point in space/time, the hurricane would not have happened. - Chaos Theory
Share things that you do, share positive things and inspire others!
Of late, I have been working for multiple clients at the same time. Different clients have different development environments, which has forced me into using Virtual Machines (VMs) for my day-to-day work. I will cover my actual setup and new way of working with VMs in a different post.
When working in VMs, I often have to switch to the host machine for email, chat and a few other programs that live only on my host machine. Minimizing the VM is time-consuming and context-breaking if you are working off a single screen. On a multi-monitor setup, you can always have the VM on one screen and the host on the other, but this still gets tricky if you have more than one VM connected.
The Virtual Desktops feature in Windows 10 is a great help in this scenario. We can move between desktops using keyboard shortcuts (Ctrl + Win + Right/Left Arrow). But with the VM running on a separate Virtual Desktop, any key presses get picked up by the VM operating system and not by the host, which means you cannot use the keyboard shortcuts to switch host desktops from inside a VM. However, you can move between desktops using the four-finger swipe gesture on your touchpad (if it is supported). These swipe gestures are picked up only by the host OS, unlike the keyboard shortcuts, so even when you are inside a VM, the four-finger swipe tells the host OS to switch desktops. This allows you to easily navigate between VMs running on different Virtual Desktops.
Hope this helps!
A while back, I wrote about various one-day trip options around Sydney. Here is a list of places around Sydney that we traveled to during long weekend breaks, with a day or two of overnight stays.
Of all our trips, Coffs Harbour is the place I liked the most. It’s been almost a year since that trip, and the memories are still fresh. The beaches are great, especially Jetty Beach. The rainforest walk in Dorrigo was the best I have done to date, especially because of the rain the night before. Coffs Harbour is perfect for a 3-4 day trip, and there are a lot of places to visit around it.
We headed off to Port Macquarie to celebrate Gautham’s birthday. Gautham likes strawberries a lot, which is why we chose Port Macquarie: Ricardoes Tomatoes & Strawberries is located just ten minutes north of Port Macquarie and provides a unique pick-your-own-strawberries experience. You can spend around 2-3 hours here; make sure you don’t miss the scones from the cafe. Port Macquarie is also a great place for whale watching, and we headed off on an early morning trip to be with the whales. The boat ride (PortJet) is an experience in itself, and to our luck, we were able to see around three whales up close. We also went to Dooragan National Park, Kattang, Perpendicular Point and the Charles Hamsey lookout.
The Grand Pacific Drive makes a great one-day trip, as well as a multi-day trip for those who want to take their time along this stretch. Starting from the Royal National Park and stretching all the way to the Sapphire Coast, it is a beautiful drive with great scenery and a lot of places to visit. The Grand Pacific Drive site has all the details you need to plan your trip, including a trip planner that makes planning easier. If you want to cover most of the places along the way in a single trip, it is best to give it 2-3 days. On my trip, I stopped over at Wollongong and only made it as far as Kiama.
Just 90 minutes from Sydney by car, the Blue Mountains have a lot of attractions worth visiting, making them a good place for an extended weekend trip. Wentworth Falls, Echo Point and the Three Sisters are some of the popular lookouts. Scenic World offers some good rides and entertainment for kids; I particularly liked the world’s steepest incline railway ride. The entry tickets are a bit overpriced, though.
Jenolan Caves is another hour’s drive from the Blue Mountains and is a must-do. It’s great for people of all ages, and if you have kids, they will love it. Make sure you check the different cave options and choose one that fits the people in your group. Booking a spot in advance helps, and make sure you arrive on time; the drive up there can be a bit slow, so leave enough buffer time before your cave walk starts.
Unlike Sydney, Canberra is a planned city, and you can tell from the moment you enter it. It’s a beautiful little city with a wide variety of things to visit. We started with the Cockington Green Gardens followed by the National Dinosaur Museum; you can spend almost half a day on these, and try out the Hamlet food trucks. The Parliament House and the Australian War Memorial are also worth visiting. If you time your visit for September-October, you can also see Floriade - the tulip flower festival.
Nelson Bay, Hunter Valley, Orange, Port Stephens, etc., are some of the places on our list that we could not make it to yet. I moved to Brisbane at the end of last year and am not sure when I will get another chance to explore around Sydney. But I have new places to look forward to now - exploring Brisbane!
I was given a console application written in .NET Core 2.0 and asked to set up a continuous deployment pipeline using TeamCity and Octopus Deploy. I struggled a bit with some parts, so thought it’s worth putting together a post on how I went about it. If you have a better or different way of doing things, please shout out in the comments below.
At the end of this post, we will have a console application that is automatically deployed to a server and running, anytime a change is pushed to the associated source control repository.
Setting Up TeamCity
The first three build steps use the .NET CLI to Restore, Build and Publish the application. These three steps restore the dependencies of the project, build it and publish all the relevant DLLs into the publish folder.
The published application now needs to be packaged for deployment. In my case, deployments are managed using Octopus Deploy. For .NET projects, the preferred way of packaging for Octopus is OctoPack. However, OctoPack does not support .NET Core projects; the recommendation is to use either dotnet pack or Octo.exe pack. Using the latter, I have set up a Command Line build step to pack the contents of the publish folder into a zip (.nupkg) file.
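As a rough sketch, the build steps boil down to commands along these lines (the package id and version format are illustrative; %build.counter% is TeamCity’s build counter parameter):

```shell
# Steps 1-3: restore, build and publish using the .NET CLI
dotnet restore
dotnet build --configuration Release
dotnet publish --configuration Release --output publish

# Step 4: pack the published output into a package for Octopus
octo pack --id=MyConsoleApp --version=1.0.%build.counter% --basePath=publish --format=zip
```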
The NuGet package is published to the NuGet server used by Octopus. Using the Octopus Deploy: Create Release build step, a new release is triggered in Octopus Deploy.
Setting Up Octopus Deploy
For the Deploy Package step I have enabled Custom Deployment Scripts and JSON Configuration variables. Under the pre-deployment script, I stop any existing .NET applications. If multiple .NET applications are running on the box, select your application explicitly.
Once the package is deployed, the custom script starts up the application.
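As a sketch, the pre- and post-deployment scripts could look like the following PowerShell (the application and process names are illustrative; Octopus.Action.Package.InstallationDirectoryPath is the Octopus system variable for the install location):

```powershell
# Pre-deployment: stop the running instance of the app. A framework-dependent
# .NET Core app runs under the "dotnet" host process, so filter by command
# line when more than one .NET app runs on the box.
Get-CimInstance Win32_Process -Filter "Name = 'dotnet.exe'" |
    Where-Object { $_.CommandLine -match "MyConsoleApp" } |
    ForEach-Object { Stop-Process -Id $_.ProcessId -Force }

# Post-deployment: start the freshly deployed application
$installDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
Start-Process -FilePath "dotnet" -ArgumentList "$installDir\MyConsoleApp.dll"
```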
With all that set up, any time a change is pushed to the source control repository, TeamCity picks it up, builds it and triggers a deployment to the configured environments in Octopus Deploy. Hope this helps!
Often when working with SQL queries, I come across the need to capitalize SQL keywords across a large query - for example, to capitalize the SELECT, WHERE and FROM clauses. With a large query or stored procedure, this is faster done in a text editor, and Sublime Text is my preferred editor for this kind of text manipulation.
Sublime Text Editor comes with a few built-in text casing converters that we can use, to convert text from one case to another. Using the simultaneous editing feature, we can combine it with case conversion and manipulate large documents easily.
For example, say I have the below SQL query. As you can see, the SELECT and FROM keywords are cased inconsistently across the query.
```sql
select Id, Name FROM Customers
SELECT Id, OrderDate from Orders
select p.Id, p.Price FROM Products p
SELECT * from OrderItems
select Count(*) FROM Customers
```
To standardize this (preferably capitalizing them all), highlight one of the ‘select’ keywords and then highlight all occurrences of it (Alt + F3). Once all occurrences of ‘select’ are highlighted, bring up the command palette (Ctrl + Shift + P on Windows) and search for ‘Convert Case’. From the options listed, choose the case you want to convert to, and all selected occurrences of the keyword will now be in that case.
Hope this helps you when you have a lot of text case manipulations to be done.