How to Enable and Use GCC Strict Mode Compilation

One of the great features that many C++ programmers rarely use is GCC strict mode compilation. Enabling it lets the compiler warn you about potential issues that often go unnoticed in build noise. Unfortunately there is little documentation, let alone a quick tutorial, on this subject, so I thought I'd write this up.

First, let's clear this up: there is no official GCC mode called "strict". I just made that term up. Fortunately there are enough compiler options that you can rig up a "strict" mode like the one often available in many other languages.

To get "strict" mode, I use the following command-line options for gcc/g++. Below they are written in a format consumable in CMakeLists.txt, but you can use the same options from pretty much anywhere.

set(CMAKE_CXX_FLAGS "-std=c++11 -Wall -Wextra  -Wstrict-aliasing -pedantic -fmax-errors=5 -Werror -Wunreachable-code -Wcast-align -Wcast-qual -Wctor-dtor-privacy -Wdisabled-optimization -Wformat=2 -Winit-self -Wlogical-op -Wmissing-include-dirs -Wnoexcept -Wold-style-cast -Woverloaded-virtual -Wredundant-decls -Wshadow -Wsign-promo -Wstrict-null-sentinel -Wstrict-overflow=5 -Wswitch-default -Wundef -Wno-unused -Wno-variadic-macros -Wno-parentheses -fdiagnostics-show-option ${CMAKE_CXX_FLAGS}")
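If you are on a newer CMake, the same flags can also be scoped to a single target instead of being set globally. A minimal sketch (the target name `myapp` is hypothetical; trim the flag list to taste):

```cmake
# Per-target alternative to the global CMAKE_CXX_FLAGS approach above.
# This keeps the strict flags off third-party code pulled in via
# add_subdirectory, which is exactly where most of the noise comes from.
target_compile_options(myapp PRIVATE
    -Wall -Wextra -pedantic -Werror
    -Wshadow -Wold-style-cast -Woverloaded-virtual)
```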

That's a looong list of compiler options, so I hope you can now agree that we really mean "strict" business here :). In essence it enables extra warnings, turns all warnings into errors, points out coding issues that border on pedantic, and then on top of that enables some more warnings. Rest assured, the above is not overkill. You are going to thank the compiler for taking care of this stuff as your code base becomes larger and more complex.
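To make this concrete, here is a hypothetical example of the kind of bug that -Wshadow (included in the list above) turns into a hard error. Without the strict flags, this compiles without a peep:

```cpp
#include <cassert>

struct Counter {
    int value = 0;
    // With -Wshadow -Werror this does not compile: the parameter
    // shadows the member, so "value = value" assigns the parameter
    // to itself and the member is never updated.
    void set(int value) {
        value = value;   // bug caught by -Wshadow
    }
};
```

The silent failure is easy to demonstrate: after `set(42)` the member is still 0.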

Unfortunately, the road from here has lots of twists and turns. The first thing that might happen is that you get tons of errors, most likely not from your own code but from included headers that you don't own! Because of the way C++ works, other people's bad code in their headers becomes your liability. Except for Boost and the standard library, I haven't found many packages that can get through strict mode compilation. Even relatively nicely written packages such as ROS produce tons of compiler errors, and for badly written packages such as the DJI SDK, forget about it. Right... so now what?

Here's the fix I've used with a fair amount of success. First, declare these two macros in some common utility file in your project:

#define STRICT_MODE_OFF                                                 \
    _Pragma("GCC diagnostic push")                                      \
    _Pragma("GCC diagnostic ignored \"-Wreturn-type\"")                 \
    _Pragma("GCC diagnostic ignored \"-Wdelete-non-virtual-dtor\"")     \
    _Pragma("GCC diagnostic ignored \"-Wunused-parameter\"")            \
    _Pragma("GCC diagnostic ignored \"-Wpedantic\"")                    \
    _Pragma("GCC diagnostic ignored \"-Wshadow\"")                      \
    _Pragma("GCC diagnostic ignored \"-Wold-style-cast\"")              \
    _Pragma("GCC diagnostic ignored \"-Wswitch-default\"")

/* Additional options that can be enabled:
    _Pragma("GCC diagnostic ignored \"-Wformat=\"")
    _Pragma("GCC diagnostic ignored \"-Werror\"")
    _Pragma("GCC diagnostic ignored \"-Werror=\"")
    _Pragma("GCC diagnostic ignored \"-Wunused-variable\"")
*/

#define STRICT_MODE_ON                                                  \
    _Pragma("GCC diagnostic pop")

Here we have two macros: the first tells GCC to turn off selected warnings before some chunk of code, and the second tells GCC to re-enable them. Why can't we just turn off all strict mode warnings at once? Because GCC currently doesn't have that option; you must list every individual warning :(. The above list is something I put together while dealing with ROS and the DJI SDK, and it is obviously incomplete. Your project might encounter more cases, in which case you will need to keep adding to the list. Another issue you might encounter is that GCC currently doesn't support suppressing every possible warning! Yes, a big oops there. One that I recently hit in the DJI SDK was this:

warning: ISO C99 requires rest arguments to be used

The only way out for me in this case was to modify DJI's source code and file an issue with them, so hopefully they will fix it in the next release.

Once you have the above macros, you can place them around problematic headers. For example:

#include <string>
#include <vector>

STRICT_MODE_OFF
#include <ros/ros.h>
#include <actionlib/server/simple_action_server.h>
#include <dji_sdk/dji_drone.h>
STRICT_MODE_ON

#include "mystuff.hpp"

We are not out of the woods yet, because the above trick works only for some header files. The reason is that GCC doesn't always compile the entire header at the point of the #include statement (templates, for example, are only instantiated later), so wrapping those #include statements in the macros is pointless. Solving those cases requires more work, and sometimes a lot more work. The trick I used was to create wrappers around the things you use from the bad headers, so that only the wrappers need the #include <BadStuff.h> statements and the rest of your code doesn't. Then you can disable strict mode just for the wrappers, and the rest of your code remains clean. To do this, you need to implement the pimpl pattern in your wrapper classes so that all objects from BadStuff.h sit behind an opaque member. Notice that the #include <BadStuff.h> statements go in your wrapper.cpp file, not your wrapper.hpp file.
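A minimal sketch of such a pimpl wrapper (the `BadStuff.h` header, the `bad_api` namespace, and the `DroneWrapper` names are all hypothetical; in real code the commented-out include would be the actual offending header, and this stand-in fake keeps the sketch self-contained):

```cpp
#include <cassert>
#include <memory>
#include <string>

// ---- wrapper.hpp: clean code includes only this header ----
class DroneWrapper {
public:
    DroneWrapper();
    ~DroneWrapper();              // defined in wrapper.cpp, where Impl is complete
    std::string status() const;
private:
    struct Impl;                  // opaque: bad types never appear in this header
    std::unique_ptr<Impl> impl_;
};

// ---- wrapper.cpp: the only translation unit that sees the bad header ----
// STRICT_MODE_OFF
// #include <BadStuff.h>          // hypothetical offending header
// STRICT_MODE_ON

// Stand-in for the bad API so this sketch compiles on its own:
namespace bad_api {
struct Drone {
    std::string state() const { return "ready"; }
};
}

struct DroneWrapper::Impl {
    bad_api::Drone drone;         // hidden behind the opaque pointer
};

DroneWrapper::DroneWrapper() : impl_(new Impl) {}
DroneWrapper::~DroneWrapper() = default;
std::string DroneWrapper::status() const { return impl_->drone.state(); }
```

Client code constructs `DroneWrapper` and never needs BadStuff.h, so it compiles under the full strict flag set.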

Even though this might require significant work in a big project, it's often worth it because you are cleanly separating the interface and the dependency for the external stuff. Your own code then remains free of #include <BadStuff.h>. This enables you to do even more, such as static code analysis on just your code. In either case, consider contributing to those projects with the bad stuff and making them pass strict compilation!

So, as it happens, a working strict mode requires buy-in from the C++ community. If everyone isn't doing it, it becomes hard for everyone else. So tell everyone, and start using it yourself today!

Downloading All of Hacker News Posts and Comments


There are two files that contain all the stories and comments posted on Hacker News from its start in 2006 to May 29, 2014 (exact dates are below). The data was downloaded using a simple program I wrote, Hacker News Downloader, which makes REST API calls to HN's official APIs. The program uses API parameters to paginate through items by created date and retrieve all posts and comments. Each file contains the entire sequence of JSON responses, exactly as returned by the API calls, in a JSON array.


Contains all the stories posted on HN from Mon, 09 Oct 2006 18:21:51 GMT to Thu, 29 May 2014 08:25:40 GMT.

Total count


File size

1.2GB uncompressed, 115MB compressed

How was this created

I wrote a small program, Hacker News Downloader, to create these files; it is available on GitHub.


The entire file is a JSON-compliant array. Each element in the array is a JSON object that is exactly the response returned by the HN Algolia REST API. The property named `hits` contains the actual list of stories. As this file is very large, we recommend JSON parsers that can work on file streams instead of reading the entire data into memory.

	{
		"hits": [{
			"created_at": "2014-05-31T00:05:54.000Z",
			"title": "Publishers withdraw more than 120 gibberish papers",
			"url": "",
			"author": "danso",
			"points": 1,
			"story_text": "",
			"comment_text": null,
			"num_comments": 0,
			"story_id": null,
			"story_title": null,
			"story_url": null,
			"parent_id": null,
			"created_at_i": 1401494754,
			"_tags": ["story"],
			"objectID": "7824727",
			"_highlightResult": {
				"title": {
					"value": "Publishers withdraw more than 120 gibberish papers",
					"matchLevel": "none",
					"matchedWords": []
				},
				"url": {
					"value": "",
					"matchLevel": "none",
					"matchedWords": []
				},
				"author": {
					"value": "danso",
					"matchLevel": "none",
					"matchedWords": []
				},
				"story_text": {
					"value": "",
					"matchLevel": "none",
					"matchedWords": []
				}
			}
		}],
		"nbHits": 636094,
		"page": 0,
		"nbPages": 1000,
		"hitsPerPage": 1,
		"processingTimeMS": 5,
		"query": "",
		"params": "advancedSyntax=true\u0026analytics=false\u0026hitsPerPage=1\u0026tags=story"
	}


Contains all the comments posted on HN from Mon, 09 Oct 2006 19:51:01 GMT to Fri, 30 May 2014 08:19:34 GMT.

Total count


File size

9.5GB uncompressed, 862MB compressed

How was this created

I wrote a small program, Hacker News Downloader, to create these files; it is available on GitHub.


The entire file is a JSON-compliant array. Each element in the array is a JSON object that is exactly the response returned by the HN Algolia REST API. The property named `hits` contains the actual list of comments. As this file is very large, we recommend JSON parsers that can work on file streams instead of reading the entire data into memory.

	{
		"hits": [{
			"created_at": "2014-05-31T00:22:01.000Z",
			"title": null,
			"url": null,
			"author": "rikacomet",
			"points": 1,
			"story_text": null,
			"comment_text": "Isn\u0026#x27;t the word dyes the right one to use here? Instead of dies?",
			"num_comments": null,
			"story_id": null,
			"story_title": null,
			"story_url": null,
			"parent_id": 7821954,
			"created_at_i": 1401495721,
			"_tags": ["comment"],
			"objectID": "7824763",
			"_highlightResult": {
				"author": {
					"value": "rikacomet",
					"matchLevel": "none",
					"matchedWords": []
				},
				"comment_text": {
					"value": "Isn\u0026#x27;t the word dyes the right one to use here? Instead of dies?",
					"matchLevel": "none",
					"matchedWords": []
				}
			}
		}],
		"nbHits": 1371364,
		"page": 0,
		"nbPages": 1000,
		"hitsPerPage": 1,
		"processingTimeMS": 8,
		"query": "",
		"params": "advancedSyntax=true\u0026analytics=false\u0026hitsPerPage=1\u0026tags=comment"
	}

Where to download

As GitHub restricts each file to 100MB and also has policies against data warehousing, these files are currently hosted on FileDropper. Unfortunately, FileDropper currently shows ads with misleading download links, so be careful which link you click. Below is the screenshot of what FileDropper shows; currently the button marked in red downloads the actual file.


HN Stories Download URL

Using Browser:

Using Torrent Client: magnet link (thanks to @saturation)

Archived at: Internet Archive (thanks to Bertrand Fan)

HN Comments Download URL

Using Browser:

Using Torrent Client: magnet link (thanks to @saturation)

Archived at: Internet Archive (thanks to Bertrand Fan)

A few points of interest

  • The API rate limit is 10,000 requests per hour, or you get blacklisted. I tried to be even more conservative by putting a 4-second sleep between calls.
  • I like to keep the entire response from each call as-is, so the responses are streamed to a file as a serialized array of JSON objects.
  • As the output files are giant JSON files, you will need a JSON parser that can use streams. I used JSON.NET, which worked out pretty well. You can find the sample code in my GitHub repo.
  • In total, 1.3M stories and 5.8M comments were downloaded, and each download took about 10 hours.
  • It's amazing that all of HN's stories and comments so far fit into just under 1GB compressed!

Issues and Suggestions

Please let me know about any issues and suggestions in the comments. You can also file an issue at the "shell" GitHub repo I created for this data.

Moving from dasBlog to WordPress

I've written earlier about why I decided to move my site to WordPress instead of choosing Jekyll or continuing to update my custom code. In this post I'll go into some details on how I moved to WordPress, with the hope that others might have an easier time.

Previously I'd decided to use dasBlog because it was fairly minimal and hackable. In the end I modified dasBlog in such a way that it would be hard for normal users to tell where my own code ended and dasBlog started. As happens with so many open source projects, people moved on, and this project hasn't even been updated since 2012. So please move on!

Installing WordPress

As I still have some legacy ASP.NET code, I decided to host WordPress on IIS. Fortunately the famous 5-minute installation claim does hold on Windows as well. You just install it through the Microsoft Web Platform Installer (WebPI) and off you go; well, except for a few things.

  1. It's best to install and test everything on your local machine first and then move it to your web host. By default WebPI uses the WebMatrix server, but you might want to use full IIS with all its goodies for experimentation. There are plenty of instructions for installing IIS on Windows.
  2. Search for WordPress in WebPI and choose the WordPress product that has the WordPress logo; avoid variants such as Brandoo. In WebPI, make sure you click on Options:


    On the options screen you should select these options:


  3. When the WordPress installation dialog comes up, select New Web Site instead of using the default (it's a good practice!) and specify some local folder for all the WordPress files.
  4. Start IIS, stop the Default site, and start the wordpress site. Navigate to localhost, fill in a username and password, and you should be able to log on to your brand new WordPress website!
  5. I would highly recommend moving the WordPress installation to a subfolder instead of keeping it in the root. This has several advantages. First, you keep the WordPress files separate, so during updates there is no worry about overwriting your own stuff in the root. Second, in the root folder you can host your own code or override WordPress behavior using URL redirects. Finally, this arrangement allows you to host other web applications and sub-sites in their own folders sitting next to WordPress. The instructions are very easy, and your external WordPress URLs don't change.
  6. There are a few essential settings you'll want to set now: the timezone in Settings > General and the URL format in Settings > Permalinks. For permalinks I used the Custom Structure with value


Exporting from dasBlog

The easiest way to migrate posts and comments from dasBlog is the DasBlogML tool. Unfortunately it seems to have gotten lost from the Web altogether after the MSDN folks decided to reorganize a few things. I've put the copy I used on GitHub, and for me the process went smoothly without errors. If you do encounter issues, there are a few posts out there for help.

Importing to WordPress

While WordPress doesn't have any built-in way to import BlogML content, there is a plugin, BlogML Importer. Again, this plugin hasn't been maintained and is broken with the current version of WordPress. So I forked it on GitHub and updated it with the fix. You just need to install the original plugin and overwrite its files with the ones in my repository. Also look at this article for tips.

Cleaning Up the Markup

Over the years I'd used quite a few tools to post to my blog, and some of them produced really messed-up HTML. So one thing I needed was to clean up the markup in my posts. The Tidy2 plugin, which can be downloaded from within Notepad++, is a godsend for this purpose. However, you might need to invest significant time in configuring it. I've put the Tidy2 config that I tweaked for hours on GitHub. This config is fairly robust and does a good job of cleaning bad markup in HTML fragments, even those awful MS Word extensions.

Managing Redirects for the Old Links

One of the things I care about a lot is making sure that links to my website remain valid across moves. I've practically obsessed over this even though this website is not popular and there are hardly any old links out there pointing back here. But still, I have 301 redirects going all the way back to the year 2000, so even those links remain valid today after 3 major technology stack changes. With IIS Rewrite Maps, things have become much easier. Here's what I did: create a file like the one below with the list of URLs for individual posts dumped from the DasBlogML tool, plus others you add manually. For categories you should have one URL for each category in dasBlog.

	<rewriteMap name="ShitalShahV3Redirects">
		<!-- redirects for the pages -->
		<add key="/aboutme.asp" value="/about/author/"/>
		<add key="/aboutme.aspx" value="/about/author/"/>
		<!-- etc -->
		<!-- 401s detected from Google Webmaster tools -->
		<add key="/?s=CategoryView.aspx" value="/p/category/" />
		<add key="/blog/CategoryView.aspx" value="/p/category/" />
		<add key="/?s=content/AllComments.xml" value="/comments/feed/" />
		<add key="/?s=CommentView.aspx" value="/comments/feed/" />
		<add key="/blog/content/AllComments.xml" value="/comments/feed/" />
		<!-- redirects for the feeds -->
		<add key="/blog/SyndicationService.asmx/GetRss" value="/feed/"/>
		<add key="/blog/SyndicationService.asmx/GetAtom" value="/feed/atom/"/>
		<!-- Redirects for categories -->
		<add key="/blog/CategoryView.aspx?category=AI" value="/p/category/machine-learning/" />
		<add key="/blog/CategoryView.aspx?category=Announcement" value="/p/category/personal-news/" />
		<!-- etc -->
	</rewriteMap>

Now you can reference the above map in your web.config. The example below also takes care of other URL patterns that dasBlog had. It does not, however, handle the GUID-based URLs that dasBlog used as permalinks; unfortunately it's just too much effort to mine them and map them to new WordPress URLs. I used Google Webmaster Tools to find the external GUID links that were getting 404s. For me there were only a couple, so it was a quick fix.

		<rewrite>
			<!-- Include the map -->
			<rewriteMaps configSource="ShitalShahV3Redirects.config" />
			<rules>
				<!-- If we find a match in the map then use just that -->
				<rule name="dasBlogTitleRedirects" stopProcessing="true">
					<match url="(.*)" />
					<conditions>
						<add input="{ShitalShahV3Redirects:{REQUEST_URI}}" pattern="(.+)" />
					</conditions>
					<action type="Redirect" url="{C:1}" appendQueryString="false" redirectType="Permanent" />
				</rule>
				<!-- Redirects for various URL patterns that dasBlog provided -->
				<!-- date based URLs -->
				<rule name="dasBlogDateRedirect" stopProcessing="true">
					<match url="^blog(.*)" />
					<conditions>
						<add input="{QUERY_STRING}" pattern="(?:^|&amp;)date=(\d+)-(\d+)-(\d+)(?:&amp;|$)" />
					</conditions>
					<action type="Redirect" url="/p/{C:1}/{C:2}/{C:3}/" appendQueryString="false" redirectType="Permanent" />
				</rule>
				<!-- month based URLs -->
				<rule name="dasBlogMonthRedirect" stopProcessing="true">
					<match url="^blog(.*)" />
					<conditions>
						<add input="{QUERY_STRING}" pattern="(?:^|&amp;)month=(\d+)-(\d+)(?:&amp;|$)" />
					</conditions>
					<action type="Redirect" url="/p/{C:1}/{C:2}/" appendQueryString="false" redirectType="Permanent" />
				</rule>
				<!-- year based URLs -->
				<rule name="dasBlogYearRedirect" stopProcessing="true">
					<match url="^blog(.*)" />
					<conditions>
						<add input="{QUERY_STRING}" pattern="(?:^|&amp;)year=(\d+)(?:&amp;|$)" />
					</conditions>
					<action type="Redirect" url="/p/{C:1}/" appendQueryString="false" redirectType="Permanent" />
				</rule>
				<!-- Any other URLs -->
				<rule name="dasBlogRootOtherRedirect" stopProcessing="true">
					<match url="^blog\/(.+)" />
					<action type="Redirect" url="/?s={R:1}" appendQueryString="false" redirectType="Permanent" />
				</rule>
				<!-- Blog's root -->
				<rule name="dasBlogRootRedirect" stopProcessing="true">
					<match url="^blog[\/]?" />
					<action type="Redirect" url="/" appendQueryString="true" redirectType="Permanent" />
				</rule>
				<!-- main website old redirects -->
				<rule name="defaultAspxRedirect" stopProcessing="true">
					<match url="^(default\.asp[x]?)$" />
					<action type="Redirect" url="/" appendQueryString="true" redirectType="Permanent" />
				</rule>
				<rule name="wordpress" patternSyntax="Wildcard">
					<match url="*" />
					<conditions>
						<add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
						<add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
					</conditions>
					<action type="Rewrite" url="index.php" />
				</rule>
			</rules>
		</rewrite>

Themes, Plugins, Pages, Commenting and General Organization

I'll write a more detailed post on finding programmer-friendly themes and essential plugins to make WordPress more hackable (the short answer is that I'm using Decode for the theme). However, eventually a question will come and haunt you: should you use a page or a post for X, where X is photo albums, your projects, articles, and so on? I eventually settled on the principle of using posts for pretty much everything except a few rare cases such as About and Disclaimer. The primary reason is that posts can be categorized, which ultimately appears as the navbar on my website. Also, I've stopped treating posts as the immutable pieces of textual stream that they were a decade ago in the RSS world. Instead I look at them as evolving articles that get refreshed as new information becomes available. Just that underlying principle has helped me clear up my mind about using posts instead of pages for most scenarios.

I decided not to write my own gallery code or use WordPress's built-in option (which I think is pretty bad). The thing is that photo galleries are finally becoming sophisticated enough that, just like blog engines, they would take a lot of your time to do really well. That's time you could have spent working on more interesting problems. So I left hosting of photo galleries to PicasaWeb (and Flickr; they are offering a whopping 1TB for free!). The way this works is that I simply create a post for each new album with a description and a link to PicasaWeb and/or Flickr.

The next thing to get rid of was hosting my projects, binaries, and code on my website. GitHub has evolved into the obvious choice to browse, view, and download, so there is now little point in building my own stuff to do the same thing. So again, the strategy is to create a post for each new project that points to my repos on GitHub.

Finally, I also decided to dump the built-in commenting system of WordPress. As an experiment I left it turned on for a few weeks, and I got 150 spam comments. It's a huge pain to clean that up, and it's disappointing that even in 2014 the WordPress out-of-box comment system is just not usable. There are plugins like Akismet, but it's not completely free. The next obvious option was Disqus. They have proven that they can scale, they are robust, they have great community support, and most importantly they allow exporting all your data so you can switch to something else if you want to. Despite all these positives, I did encounter a few unhappy moments. For example, the markup currently injected by Disqus doesn't validate as HTML5. This is super bad for a product that is almost viral. I contacted their support, who apparently didn't consider this an alarming issue and asked me to post it in their community forums where all their devs hang out. Huh? Why can't their support just forward it to their own devs instead of me having to find them in their public forums?

Where Do You Host This Thing?

I'd had WebHost4Life as my hosting provider for a very long time. However, recently they have been going downhill. Their control panel is an ancient mess of Frankenstein apps. They haven't yet gotten around to supporting the latest versions of .NET, IIS, and so on. Just doing FTP to their servers gave me nightmares with frequent errors and disconnections. Plus their prices are no longer competitive. So I took this opportunity to check out the cloud providers. It turns out that none of the popular providers (Amazon, Rackspace, Azure) has a viable option for a low-traffic website like this one at a price comparable to something like WebHost4Life (while Azure has an option for a free website, it doesn't allow custom domains). Even after recent price cuts from Google, Azure, and Amazon, hosting a website like this can easily cost $30 per month, and that's with severe limitations on bandwidth, storage, and compute. So I went back to looking at regular web hosts and zeroed in on these folks. They are just great. They have a very modern control panel, nice support, and easy management of emails, multiple websites, databases, FTP, and so on. Plus their advertised storage and bandwidth are unlimited, which had been my primary criterion, even though it really doesn't mean that in practice. I just never want my users to see a "This website has exceeded its bandwidth quota" error.

Deploying to Production

Finally, it's time to move your localhost WordPress installation to the actual web host. How do you do it? It turns out there is no built-in easy way. Sure, you can export your content as an archive and import it somewhere else, but what about all the themes, plugins, and customizations you've been making all along? Fortunately there is a fantastic WordPress plugin called Duplicator that worked like a charm in my case. It moved everything to my actual server without a hitch.

Welcome to V4

It's that time of the year again: upgrading the technology stack behind my site! Actually, much more than that. I'd been neglecting to post here for a very long time. Pretty much everything that could happen to prevent me from posting seems to have indeed happened in the past few years. There have been lots of glowing moments of insight, clarity, and awesomeness, all of which have now slipped away from my keystrokes to remain buried in my fragile, volatile memory. The only thing I can say is: you poor reader!

People have argued that social media spells the end of posting on personal websites and blogs. In reality, social media has optimized itself so much for sharing statuses, links, photos, and the like that it is a rather dull tool for meaningful longer writing, especially technical writing. I guess no one in social media currently cares about the ability to syntax-highlight code, add LaTeX equations, or embed the latest commits from GitHub repos.

The decision to revive this website came with many choices. Way back in the 1990s, I insisted on building my own computers for my home and writing my own software for my homepage. I enjoyed doing both because I loved obsessing over all the tiny details of hardware specs and software behavior. But then two things happened: first, writing a blog engine with all the bells and whistles started becoming a full-time job, and second, it's hard to beat the Mac Mini on size, specs, price, and the ability to run macOS as well as Windows. Of course, I could dumb down my blog engine to the minimum, but then where's the fun?

The result is that last year I finally bought a Mac Mini instead of building my own desktop. This year I decided to shelve my old SyFastPage framework, as well as my new ccBlog project, at least until I'm done with other more important things. The decision got much easier thanks to the fact that WordPress has finally evolved into something that is robust, easy to use, hackable, and extensible, and has enormous community support that would be hard to replicate.

These days no hacker can possibly choose WordPress without looking at Jekyll and its semi-clones. Initially I was excited by the whole concept of throwing away the fat server-side stack and having every change archived on GitHub, but as I thought about it more, I felt Jekyll wasn't passing this litmus test: everything should be made as simple as possible, but not simpler.

If you think about it, even though modern CMS/blog software dynamically generates pages using its fat footprint, most requests are served straight out of the cache. In essence, these fat, complex infrastructures are static site generators that store their generated content in memory instead of on disk. This enables a simplicity of use that would otherwise be sacrificed to keep the technology stack simple.

So the new version of this website is mostly WordPress, and I'm pretty happy with everything so far. The reason for "mostly" is that I'm still running some code I wrote using ASP.NET. The ability to use WordPress side by side with your own code and override any WordPress idiosyncrasy is very important to me. This gives me an escape hatch to write my own code for whatever I want using whatever stack I prefer. The source code of the old version of this website will remain available, like all of my open source projects.

In the past, I'd kept the content of this blog more personal and less technical, because at that time social media didn't exist and many of my friends and family would have glazed over at technical content. Thanks to social media, I can now continue posting all those personal opinionated blurbs there and use this website for sharing something more serious. If you are interested in the former, you can follow my social feeds.

Surviving Windows 8 First Encounter

After installing Windows 8 you might quickly find yourself uneasy doing a few things the old way, or "cornering" the mouse too much. The answer to your frustration is keyboard shortcuts! There are many, but below are the ones that will save the day:

Windows + D Jump to Desktop mode
Windows Switch between Start screen and last app
Windows + Q Search Apps
Windows + W Search Settings
Windows + X A popup menu for power users  (Command Prompt, Control Panel, File Explorer, Computer Management etc)
Alt + F4 Close app (Metro app doesn’t have close button)
Windows + E Open explorer
Windows + R The good old Run dialog
Windows + C Open Charms bar (allows you to quickly go to Settings and Search options for the app)
Page Up/Down Move around tiles on Start screen
Ctrl+Shift+Esc Task Manager
Windows + Tab Recent Metro apps
Right Click on Tile Options for uninstall, pin, unpin, size etc
Windows + Z or Right Click inside Metro App Show App specific bar (for example, open file, play button etc)
Windows + . Snap metro app on left

Viewing Venus Transit in Seattle Area

Well, bad news. The weather isn't looking good for tomorrow's historic event. There's something like an 80% chance of showers over almost the entire Pacific Northwest, extending even into Eastern Washington. Even the LIGO Observatory is not going to have sunshine. But if you believe that a hole in the sky might appear just to take a peek, there are several events lined up for public viewing, with free entry and no registration. Public events are a much safer way to view the Venus Transit because they have appropriate solar filters and/or projection boards. Some might even have live webcasts. If that much-anticipated hole in the sky does not appear, then here are the options for online viewing:

It would also be a good time to brush up on Venus's peculiarities and some history.

Groups, Places and Collectives for Makers in Seattle Area

We went to the Mini Maker Faire in Seattle today, and one of the most surprising things I learned was how abundant the local resources for hackers and makers are! It reminded me of the geek fairy tales you often hear about the Homebrew Computer Club. Here are some things you'll want to check out if you are interested in making stuff and live in the Seattle area:

  • Metrix Create: Space – I think this was the coolest thing I came to know about. They have a shop with everything from sewing machines to electronics, and they run lots of interesting workshops and classes. It's your neighborhood fab lab!
  • Make Seattle Meetup Group – Regular meetups for Arduino/electronics makers and learners to exchange ideas and get help with your projects.
  • West Seattle Tools Library – I thought this was the coolest concept. They have a collection of 1,500 tools that you can check out for your projects. Just look at their Tool of the Week series. There is also the Fixers Collective, who meet up at the Tools Library and would be happy to fix your broken stuff or just tinker around.
  • Xbot Robotics Workshop Space – They provide space to work on your projects along with access to almost everything you need, such as power tools, soldering stations, oscilloscopes, electronics components, drill presses, sand blasters, table saws, sanders, and more.

There were quite a few cool things to see there, everything from The Brain Machine, Zigduino, Lifesuit, and Drawbots to The Most Useless Machine Ever.

I also jotted down the next classes I want to take at Pratt and All Metal Arts.

BadImageFormatException - This assembly is built by a runtime newer than the currently loaded runtime

A strange thing happened today. I upgraded one of our internal tools to .NET 4.0 without any issues, but as soon as I attempted to debug/run the binary, I'd see this exception:

System.BadImageFormatException was unhandled. Message: Could not load file or assembly 'SomeTool.exe' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.

Normally you see this exception when the machine doesn't have the right runtime installed, but that was obviously not the case here. Changing the build to x86 or x64 didn't make any difference either. Next I ran peverify.exe, which happily reported that there was nothing wrong with the binary image. Finally I pulled out the big guns and asked fuslogvw, which would show me if any dependent assembly binding was failing. But that also didn't produce any boom sounds. So the last resort was to just meditate over the issue for a few minutes. And that worked. In a flash of enlightenment I saw an app.config buried among a bunch of files, and it had these lines:

    <?xml version="1.0"?>
    <configuration>
      <startup><supportedRuntime version="v2.0.50727"/></startup>
    </configuration>

Aha! Apparently the app.config doesn't get updated (maybe because it was checked into TFS?) when VS does the 4.0 upgrade. As the app.config didn't have anything else in it, simply deleting the file solved the issue. I do wonder how many people come across this gotcha.
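If your app.config has other settings you need to keep, deleting it isn't an option; the alternative is to update the supportedRuntime element so it targets the 4.0 CLR instead. A minimal corrected version would look something like this ("v4.0" is the standard version moniker for the .NET 4.0 runtime):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Tell the loader to use the .NET 4.0 CLR rather than 2.0 -->
  <startup>
    <supportedRuntime version="v4.0"/>
  </startup>
</configuration>
```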

Burning Man Tips for the First Timers

There is no substitute for reading the official Survival Guide, but it does leave out many things. So instead of writing up a detailed trip report, I went through all the mistakes I made and converted them into tips. Here it goes…

  1. A sun shelter (something like the REI Alcove) is absolutely essential if all you have is a camping tent. I would not, however, recommend the REI Alcove itself, because mine broke after withstanding 3 days of sandstorms. Obviously you also need a nice camping chair to go with it.
  2. Regular tent stakes are not very useful for securing a tent on the playa. I didn't believe that, and my tent almost came loose in the sandstorms. The correct way to secure a tent on the playa is with something called "rebar". You need 4 pieces of rebar for a regular camping tent, each about 3' long and 1/2" thick, and another 4 or 6 for a sun shelter. Stores like Home Depot will cut the metal rod to these sizes for you. Of course, you will need a big hammer as well to drive them in (they don't have a pointy end). Finally, it's a hazard to leave the other end of these rusty metal rods exposed, so you must stick a tennis ball or an empty bottle on the end that is sticking out.
  3. Sandstorms! In 2009 there were 3 full days of sandstorms. If you've never seen one, here's how it works: a wall of sand about 20-60 feet tall comes at you at anywhere between 10 and 40 miles an hour. Visibility typically drops to 5-50 feet and you see brownish sand everywhere around you. This can last for up to 8 hours straight. It actually looks cool, and as a matter of fact I intentionally spent lots of time inside the sandstorms taking photos (note: many SLRs are not dustproof). Even on severe storm days almost everyone continued their activities as if it were normal weather. But they were prepared. Here's what you need to be prepared:

    • You have to have sunglasses with minimal open space around the lenses. I had Tifosi Ventoux and they worked kind of OK. A retainer is also essential so you don't lose them (remember, there are no shops where you can buy an extra pair).
    • A mask for the nose and mouth. I used a Buff bandana so I could pull it up all the way to my sunglasses. Next time I would also bring a proper dust mask so I could be absolutely worry-free about staying out longer in dust storms.
    • A hat with a chin cord and a visor. This will allow you to walk head-down while the visor protects your face. The hat I used is absolutely the best hat I've ever owned; an added advantage is that its veil can be turned toward the direction of the wind.
  4. As a first-timer I found that 4 nights provides very good breadth and depth for the experience. It's the sweet spot because you have to carry less food and water, don't need to worry too much about showering, and still get to see nearly everything that's out there. I would strongly recommend against staying fewer than 3 nights.
  5. Make a stop in Reno for at least a night before heading to Black Rock City. Besides the mini-Las-Vegas-style environment, you can also visit REI, Walmart and Walgreens to get gear, water, groceries and so on. The Peppermill Resort is where I stayed overnight because it's the cheapest awesome hotel in Reno.
  6. A compass is very useful for getting onto the right street when coming back at night from far out on the playa.
  7. If you have a 3-season camping tent you are probably in trouble. These tents usually have mesh in the walls for ventilation and a rain fly to protect against rain. This arrangement, however, is pretty useless against sandstorms. On a typical stormy day I would get about a half-inch layer of sand inside my 3-season tent, and I had to spend an hour every night cleaning things up. Make sure you have duct tape and things to cover up the mesh. Everything outside the tent should be tied to rebar so it doesn't fly away in the storms.
  8. Daytime temperatures at Burning Man are 110°F+. It's normally not possible to sleep inside the tent after 10 AM, and it's not unusual to stay up until 3 AM. Nights are cold and you will need a fleece jacket. I had a 30-degree-rated bag, which was a bit warmer than necessary but worked fine. BTW, try to bring a sleeping bag liner to keep the bag from getting sweaty.
  9. As a first-timer it might be hard to understand the gift culture. For example, someone I didn't know at all would give me something and I would be confused about whether I should really take it. This can spoil the experience for both the giver and the taker. Here's how it works: if someone gives you a gift, take it with a smile without hesitation, and give them a gift in return. It's that simple. The only thing to remember is to bring a bunch of gifts that you can easily carry around. Some examples of gifts: custom stickers with your favorite quotes, necklaces, bead jewelry, things that glow, Trillion Dollar Notes, candies, custom-printed postcards and so on. The single biggest mistake first-timers make is not bringing gifts to give away.
  10. A bicycle will be your savior. If you can't bring one from home, rent or steal one in Reno. BRC is really huge and you would be lucky to cover just one street a day on foot. Most of the events I was interested in were many miles apart, and it's impossible to make them on time without a bicycle. Besides, you really don't want to be exhausted from walking 10 miles just to attend a couple of events.
  11. As soon as you enter the gate you will be handed a map and a booklet of events. The map is easy to understand because of its semicircular streets. Any location is specified using a clock position and a street name: 6 o'clock is the center, while 2 o'clock and 10 o'clock are the two ends of the semicircle. The street names are different every year depending on the theme.
  12. There are NO assigned camping sites at Burning Man (except for theme camps). Normally everything except the outer two streets is occupied within the first couple of days. When I arrived on Wednesday evening I decided to keep as much distance from the outer road as possible, because that's where all the cars keep passing by. But I didn't want to be too far inside either, because the view of the vast playa from the outer street is awesome. I also wanted to stay in the middle so the distance to any side is minimized. Always consider whether your neighbors have enough space around their tent for a picnic table, chairs, a grill or another car. The best space I found on Wednesday night was at 7:15 and Kinship.
  13. If someone asks you "which camp are you in?", that usually means "what is the name of your camp?". You can consult with your neighbors to decide on a name for a group camp and put up a sign. You can also just give the location, such as 7:15 & Kinship, like other first-timers do.
  14. The iPhone on AT&T did work with a full 4 bars at Black Rock City in 2009. It's really, really a good choice to turn it off and resist the temptation to tweet/Facebook for your entire stay.
  15. Here's one of the most disgusting parts of my Burning Man experience: the place I chose to camp was downwind of the porta-potties. That means every time the wind blew even a little, it would smell really, really bad. It was horrible. Fortunately I came to my tent only to sleep, when the smell was gone. It's really worth making sure you avoid such smelly spots.
  16. Don't wear jeans at Burning Man! Many places that don't allow jeans believe they are too "casual", but at Burning Man you don't want to wear jeans because they would be too "formal"! I wore my normal jeans on the first night and felt so out of place and formal that it was embarrassing. If you must wear jeans, take an old pair and paint them or tear them or whatever. Shorts are a semi-casual option at Burning Man.
  17. The list of events at Burning Man is overwhelming. You do not have to go crazy trying to attend as many events as possible; it's best to choose no more than 3 events per day. It's a good idea to reserve one day to just sit back at your tent and look at the vast playa in the mild warm wind :).
  18. Cooking at Burning Man can get tricky because of the sandstorms. I'd brought my camping stove, which didn't work out in the storms. Fortunately, better-prepared folks did their cooking in their camps and gave away lots of free food. I'd also brought some non-perishable stuff like bananas, milk powder, Sahale snacks, bars, chips and so on. One last option for food is to go back to town by bus or car, but that might ruin your experience. Fortunately I didn't have that option because I didn't have a car and I never found the bus. One good outcome was that I lost a few pounds! Next time I might just buy or rent a grill because it works better in dust storms.
  19. I estimated 1.5 gallons of water per day per person, and that worked out well for me.
  20. You can meet and greet your neighbors without feeling awkward. When you cook a meal, offer some to them. Always have at least two gifts to give your two neighbors when departing. Examples of gifts: a sketch you made while at Burning Man, other artwork, a poem, your favorite book, or a photo you took.
  21. Yes, you do need a kitchen sink! You will need it to brush your teeth, wash your hands, etc. It's not OK to spill soapy water on the playa floor, but if your sink is small and your soap biodegradable, you can collect the water in the sink and then go to the outer street to fling it away.
  22. Most people do not shower at Burning Man (or so I think). There are a few elaborate large camps that set up showers, and some even offer them to others. But in general, don't expect showers. The alternative is a wet towel to wipe down your body.
  23. Many camps on the inner circle serve free drinks to anyone who wants them, but you need to bring your own cup. It helps a lot if your cup has a lid.
  24. There is a place everybody refers to as "The Temple". It's located beyond The Man, and many first-timers miss it because they usually don't go beyond The Man (it's a LONG walk too). The Temple is truly sacred in many respects without being connected to even the concept of religion. It's humbling and even moving to spend a day at the temple. The Temple burns on the day after The Man burns, and even more spectacularly.
  25. Burning Man usually has 35,000+ people, which causes 2-4 hours of traffic jams on the way out. While it's fun to see the parade of every RV ever made, the time with the least traffic for leaving Burning Man is Saturday night after The Man burns. The flip side is that you will miss the even more spectacular burn of The Temple, which happens on Sunday night. Another downside of leaving on Saturday night is that you run a huge risk of falling asleep while driving. When I left on Saturday night I saw at least 3 accidents on the way, so I decided to just pull over and sleep anyway. Next time, I would leave on Monday late afternoon.

Finally, here are pictures from my trip, which will hopefully give you some idea of what to expect.