My chat community on Slack for digital nomads, called #nomads, has almost 4,000 members now, and it’s been going for almost a year (actually, in exactly 6 days it will be one year, wow!).
Since it’s 4,000 people, we’ve always hit the 10,000-message limit for free Slack accounts. Paying would cost about 4,000 × $15/m = $60,000/m, or $720,000/y, which I can’t afford. So we’ve just accepted the limit. It’d be fun to read the archive sometimes though.
When it’s ready, Slackbot will send you a message with a download link:
The problem is that this export is a .ZIP file full of JSON files.
Everything IS in fact in there, but it’s pretty hard to read.
Also, all usernames are replaced by user ids, and there are no user images to be found in the chat logs; they’re in a different file called users.json. All channel names are also replaced by channel ids, which can be found in channels.json.
Point is, it’s a big maze of files and data.
So I wrote a script that ties everything together and outputs nice Slack-looking HTML files of your entire JSON export. Automatically!
It makes this:
Look like this:
The script does a few things:
– Load the users file to identify user ids (and add user pics)
– Load the channels file to identify channel ids
– Compile daily logs into single channel logs (in JSON)
– Generate a layout from channel logs (in HTML)
– Generate an index.html with links to all channel logs in your Slack
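The script itself isn’t shown here, but the steps above can be sketched roughly like this in Python (the field names follow Slack’s export format; the function names and the HTML layout are just illustrative):

```python
import html

def build_user_lookup(users_json):
    # users.json is a JSON array of {"id": "U...", "name": "...", ...}
    return {u["id"]: u["name"] for u in users_json}

def resolve_mentions(text, users):
    # Slack's export stores mentions as "<@U123ABC>"
    for uid, name in users.items():
        text = text.replace(f"<@{uid}>", f"@{name}")
    return text

def render_channel(messages, users):
    # messages: the concatenated daily JSON files of one channel,
    # ordered by their "ts" (timestamp) field
    rows = []
    for m in sorted(messages, key=lambda m: float(m.get("ts", 0))):
        if "text" not in m:
            continue
        who = users.get(m.get("user", ""), "unknown")
        text = html.escape(resolve_mentions(m["text"], users), quote=False)
        rows.append(f"<p><b>{who}</b> {text}</p>")
    return "\n".join(rows)
```

The channels file gets the same lookup treatment for channel ids, and a final loop writes one HTML file per channel plus the index.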
Until 2 years ago, I used to be a PC person. I had a giant tower desktop with fans and flashing lights. I replaced it with a maxed-out MacBook Pro so that I could start traveling and work from anywhere. The problem is, since then I’ve missed PC gaming. All that startup stuff gets so incredibly boring after a while, and we need to destress. Why even leave your computer screen to destress when you can do it ON YOUR COMPUTER? YES! YES! FREEDOM OF REALITY!
So let’s browse the games in Apple’s App Store. Well, they’re not so great. It’s kind of the iOS-type stuff, but for OSX. Pretty very, very shit.
But that’s stupid, because the MacBook Pro 15″ has two graphics cards, and they’re actually pretty powerful. The MacBook Pro 13″ and MacBook Air only have on-board graphics, but those are fine for PC games from a few years ago (like Skyrim). So it’s a bit of a shame we can’t play games on them. And, well, destress.
How about GTA V? It came out for PC a few months ago, so I wanted to see if I could get it working on my MacBook Pro. I was pretty sure I couldn’t, but I still wanted to try. I mean, I’ve been wanting to play this for years, but never had a device for it. I mean, YOU NEED TO PLAY THIS, RIGHT?
I know you can run Windows on Mac with Parallels. But it’s a virtualization app, so it’d never run games with high performance: the graphics drivers are virtual (software-emulated), not native (hardware). Try it with any game; it’ll probably crash before you even get to play, or it’ll be extremely slow.
But then there’s Boot Camp, which lets you run Windows natively (without virtualization) and with high performance on your Mac. After it’s installed you’ll have to reboot to switch to Windows, but that only takes half a minute each time.
Since Apple doesn’t like Windows, it makes it REALLY EXTRA SUPER hard to get Boot Camp to work. Obviously, ’cause they hate Windows and never want you to use it. I get it. But that means it’s full of stupid bugs that you have to figure out how to fix yourself. It took me 10 days. Yes. 10 days of tears. Maybe that’s why I don’t know anybody using Boot Camp. So to save you all the PAIN and time, here is my tutorial with all the tricks to get it working.
What you’ll need
16GB USB stick (not an SD card!), I tried an 8GB one as Apple recommends, but it wasn’t big enough, yup WHATEVER!
Windows 8 ISO file, in a perfect world you’d buy this from Microsoft, but they make it really hard and want to ship you a physical CD (what the fuck, it’s 2015, let me buy an ISO), so just find an ISO file of Windows somewhere
Steam account to buy GTA V PC (it’s about $50 I think, worth it because you can play it online if you buy it legally)
Prepare Boot Camp
First search for Boot Camp Assistant on your Mac. Click Continue and you’ll see this:
If this is your first time, select ALL boxes. The first one loads your USB stick with Windows and OSX’s Boot Camp loader, the second one adds the Boot Camp drivers, and the third one sounds weird but means it’ll partition your drive to set up Windows.
So now click Continue:
Select your Windows ISO file and continue.
It’ll take some time to copy the Windows ISO to your USB stick, and then download the drivers from Apple that are compatible to your Windows version.
When it finishes, you’ll see this partition window. This means it’ll divide your hard drive into two pieces: one drive for Windows, one for Mac’s OSX. Here it gets really dodgy, because it actually doesn’t work properly EVER.
You need to choose how big your Windows drive should be. To calculate the size: Windows needs about 20 GB to function, then you need some space for your game. GTA V takes 65 GB, so that is 65+20=85 GB. To leave some headroom I rounded it up to 100 GB. But it depends on how big your games are. Skyrim, for example, is less than 10 GB, so you’d probably only need 30 to 40 GB.
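That sizing rule as a tiny calculation (the 15 GB headroom is just my guess at the rounding here; pick whatever fits your drive):

```python
def bootcamp_partition_gb(game_gb, windows_gb=20, headroom_gb=15):
    # ~20 GB for Windows itself, space for the game, plus breathing room
    return windows_gb + game_gb + headroom_gb

size = bootcamp_partition_gb(65)  # GTA V: 20 + 65 + 15 = 100 GB
```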
But then it doesn’t work
The reason I said this is dodgy is because it’ll probably fail. You’ll see this amazingly descriptive error, like my friends and I did:
It took me days to figure out how to fix it. But it comes down to two things: (1) freeing up space on your drive and (2) fixing any disk errors. Aim for about 50% free space. For me that was insane, because I have a 1TB drive with 100 GB free, so I had to free up another 400 GB. It helps to just put stuff on an external hard drive while you’re setting up Boot Camp; you can put it back after.
The non-blue stuff on Macintosh HD is my free space, not enough obviously. Make sure you get about 50% free space on your drive. So if you have 256 GB drive, get 125 GB free. At 500 GB, 250 GB free. At 1 TB, 500 GB free. You get it.
Now fix those errors
Even after clearing all that space, Boot Camp will probably still whine and fail again, like it did for me.
That’s because it’ll run into some weird errors on your drive. Those weird errors are because of, well, I have no fucking clue. But they’re there. How to fix this? Well, you open Disk Utility.
Click “Verify Disk” and it’ll check your disk. This might take a while. I got this crazy scary error. If you didn’t get that and it’s verified, just skip this part.
I was like “wait WHAT? NO!”. My SSD drive was broken? Why did nobody tell me! I rebooted into Recovery Mode (reboot and hold CMD+R) and opened Disk Utility there to verify my disk. If your disk is encrypted like mine, you need to unlock it first by right-clicking the disk, selecting Unlock and entering your password.
Then I verified it again, repaired everything and it worked fine. There were no errors. Odd right? Who cares! Because after this it worked. I rebooted into normal OSX mode and started Boot Camp Assistant again. This time I only selected the last checkbox:
Let’s try again
There we go, partition it:
After partitioning, Boot Camp Assistant automatically restarts. And then BAM!
Now Windows doesn’t like our partitions
Yay! It’s Windows! On a Mac! Don’t celebrate too early, because this is where hell starts.
See what that says? “Windows cannot be installed to Disk 0 Partition 3”. Wait WHAT? WHY! Boot Camp was supposed to fix this shit, right? I was supposed to not do anything and Boot Camp would put all the files in the right place, to make it work on Mac, right?
Then you press Format on that partition. It seems to work, but no, it doesn’t, because it says:
“The selected disk is of the GPT partition style”
What does it take for a (wo)man to get a Windows around here?
Well, a lot. After hours of Googling, I figured it out.
You need to reboot back into OSX. Exit the installation. Then hold ALT/OPTION and select Macintosh HD to boot to. Then go back to Disk Utility:
Select your BOOTCAMP partition and go to the Erase tab, then under Format select ExFAT and click Erase. Make sure you’re erasing the correct partition (BOOTCAMP not Macintosh HD).
After that reboot your MacBook into Windows by rebooting and holding the ALT/OPTION key and selecting your USB stick (I think it’s called EFI). It’ll load the Windows install again.
Try selecting the BOOTCAMP partition in the Windows installation again; you can recognize it by the size you made it. For me that was 100 GB (I think it showed as 86 GB). If it still gives an error, go for the last resort: remove the BOOTCAMP partition within the Windows installation by clicking Delete.
Then add a new partition by clicking New:
Try installing it on that partition. If that still doesn’t work, you’re out of luck, cause I have no idea either.
And then…it works
You’ll see this.
The problem is that there’s a good chance the Boot Camp drivers that let Windows understand your MacBook (e.g. use WiFi, sound, etc.) aren’t installed. Luckily they’re on your USB stick. In the Start Screen, go to search and type File Explorer. Then find your USB stick, open the Boot Camp folder, find an Install app, open it and let it run. It’ll probably reboot.
Now with all your drivers installed, most of the stuff on your MacBook will work on Windows. My friend had some problems with a Bluetooth keyboard, but that was an unofficial keyboard. My Apple one worked perfectly. As did my Logitech wireless mouse.
Now let’s make Windows suck less
Okay, so Windows 8’s Start Screen is obviously the worst interface any person has come across. Windows 8 itself actually feels pretty solid, if you get out of that insane box-square maze mayhem they call the Start Screen now. It’s insane. Who runs this company? So incredibly stupid to do this. My dad just switched to OSX because he couldn’t understand the Start Screen. Biggest fail of the century.
We have no choice though. We want to play games! So to get your start menu (from old times) back, install Classic Shell.
Then set this image as the start button in preferences:
Yay! Now to disable that stupid Start Screen, right-click on the Task Bar, click Properties, then click the Navigation tab. Check “When I sign in or close all apps on a screen, go to the desktop instead of Start”, and uncheck “When I point to the upper-right corner, show the charms”.
Now install Steam
I’ll let you do this one yourself as it’s pretty easy. Go to Steam and at the top right click Install Steam.
Then search for GTA V. Click Download.
Here’s the problem, GTA V is 65 GB and that will take awhile. You obviously don’t want to be stuck for hours in Windows. The trick here is to install Parallels in OSX (if you haven’t already). Reboot to OSX (hold ALT/OPTION and select Macintosh HD) and set Parallels up so it uses the Boot Camp partition. Open Parallels, select Boot Camp on the right and follow the instructions:
After installing, try playing GTA V. Customize the graphic settings a bit. You can’t play it on super high settings, but you can go pretty far on a MacBook Pro 15″. Like I said, it has an actually really powerful graphics card, so it can run GTA V fine.
Now you can use your Boot Camp partition within OSX with Parallels to download games/software and continue working. Then when it’s finished, reboot to Windows and play your PC games.
It took me awhile to get back into playing games when I did all of this. I mean, it’s like it has to compete with reality, which is already insane for me, and so GTA V felt somewhat “fake” to me for days, until I accepted it was a game, and nothing I did in there would be an actual accomplishment. See, that’s what startup life psychology does to you. And on a serious note, that’s why we should all play more games. Because it helps you get out of your filter bubble.
Going outside to walk your dog? Naaaaaah, why would you! There’s GTA V!
In February of this year, I arrived in Seoul to live there for two months. We could house-sit my girlfriend’s friend’s house in Hongdae, think of it like Shoreditch or Williamsburg of Seoul. It’s hip and has a big art university in the middle.
This is Hongdae. Yes, looks like Japan, don’t say that to a Korean, they will kill you.
But let’s get down to the brass tacks here, John Hancock, the clock is ticking, Charlie.
Getting a 4G SIM
Like any traveler, I arrived at the airport and tried to find a SIM card. Korea is very modern in many ways, but it’s also very traditional and weird in others. It’s a high-tech country, but you can’t buy a SIM card as a foreigner; you have to rent it and give it back when you leave. So I did. I went to the Olleh KT desk at the airport:
I told the KT staff that I needed unlimited data. I know what unlimited means in most countries: it’s 1 or 2 GB and then they slow you down. So I asked “is it really unlimited?” and she nodded. Cool. Let’s try. It was ~$40 per month. They made a copy of my credit card, which gave me some anxiety, but there was no other way, she said.
WiFi okay, 4G awesome
We moved in to the house, it was this cute old Korean house:
And I tried the WiFi. By Korean standards it was relatively slow: I think 20 mbps down and 5 mbps up. Then I speedtested my 4G on my phone. It was 100 mbps down and 50 mbps up, ranging upwards of 70 mbps up at times. What the fuck 1.0.
50 mbps up is insane. Divide that number by 8 (mbps is megabits, MB is megabytes) and you get over 6 megabytes per second; call it 5 MB/s with overhead. That’s 300 MB per minute, or half a CD. That’s 432 GIGABYTES PER DAY! That’s a normal person’s hard drive, IN A DAY! WHAT THE FUCK 2.0.
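The back-of-envelope math, spelled out (50 ÷ 8 is really 6.25 MB/s; the 5 MB/s figure allows for protocol overhead):

```python
mbps_up = 50                       # measured upload speed in megabits/s
mb_per_s = mbps_up / 8             # bits to bytes: 6.25 MB/s raw
mb_per_s_real = 5                  # call it 5 MB/s after overhead
mb_per_min = mb_per_s_real * 60    # 300 MB per minute, half a CD
gb_per_day = mb_per_s_real * 60 * 60 * 24 / 1000  # 432 GB per day
```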
Cue my backup situation
I have about 3 TB of backups:
And I have like 6 copies of this 3TB drive spread around the world.
My backup drive’s data goes back to my first computer from 1994. I never lost anything. I almost lost a .BMP image (I drew a tribal scene with a totem pole and a snake) in 1994 when I was 8, because my brother overwrote it on THIS diskette.
And then I was able to recover it in 2012, 18 years later:
Don’t say it’s sexist, it was 1994 and I was 8.
So yes, I never lost any data. I know you can criticize me now if you’re a buddhist. “Everything shall pass” etc. But I don’t care. I’d like to be 82 years old and then watch MY ENTIRE LIFE and everything I made. Whatever you say. Then my kids will save my backups and in 500 years we will all enjoy Pieter’s pictures. Okay, they won’t. But they might. ANYWAY.
Like I said, I have about 3 TB of backups and a 1TB internal SSD.
Since I’m always traveling I have to carry these around, and they’re heavy, bulky, and they will crash at some point. I wanted to put them in the cloud. And if anything, with this speed, now was the time.
CrashPlan SUCKS because it caps the upload speed at 2mbps. And its servers suck; they seem to be mostly in the US. The whole service is a joke. So instead I went more raw: I set up an AWS Glacier vault (that means you go to Amazon AWS and ask them to set up a cloud archive for you; it sounds hard, it’s not hard).
P.S. This is NOT Amazon Cloud Drive (!), this is AWS Glacier. Very different. Although they use the same backend.
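I’ll get to Arq below, which handles all of this for you. But as a sketch of what “raw” Glacier involves under the hood: big archives go up as multipart uploads, where the part size must be a power of two between 1 MiB and 4 GiB and an upload can have at most 10,000 parts. Roughly:

```python
import math

MIB = 1024 * 1024
MAX_PARTS = 10_000          # Glacier's limit on parts per multipart upload

def glacier_part_size(archive_bytes):
    # Pick the smallest power-of-two part size (>= 1 MiB, <= 4 GiB)
    # that fits the archive in at most 10,000 parts.
    size = MIB
    while math.ceil(archive_bytes / size) > MAX_PARTS:
        size *= 2
    if size > 4096 * MIB:
        raise ValueError("archive too large for a single multipart upload")
    return size

three_tib = 3 * 1024 * 1024 * MIB    # roughly my 3 TB of backups
part = glacier_part_size(three_tib)  # 512 MiB parts, 6,144 of them
```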
The nearest AWS region was Tokyo, so I set it up there. You can always cheaply transfer your whole backup to another region like US or Europe. Or even just have multiple copies for when the asteroids hit the Earth on one side, YOUR DATA WILL PERSEVERE!
I can’t praise Arq enough, and the internet is already full of praise for it. It pretty much takes all the hard work out of uploading and downloading your backups to and from Amazon AWS. Oh, and it encrypts your data on YOUR side, before sending it to the server.
I then connected my HDDs. Yes, that’s a rice cooker on the right. Yes, I worked in a kitchen for 2 months. Yes, that’s fairly optimal:
Here we go
Backing up 3TB into Amazon AWS S3/Glacier at 4 megaBYTE per second via 4G LTE, oh 🇰🇷 Korea mobile internet I love you pic.twitter.com/juoG6zKAKK
First it was 100 GB, and it didn’t slow down. Then the first 1TB transferred. It kept going.
So what now
So what do you do when you know this is going to take a month, and you’re in Korea? Well, of course, you go for Korean BBQ A LOT:
It still took me about 20 days to upload the 4TB. But that was because I suck: I sometimes forgot to connect it, and I was away for days too. That’s still an average of 200 GB per day.
So now my backup problem is solved. It’s all in the cloud. If my HDDs are destroyed, it doesn’t matter. Hopefully. It’s in the cloud. From AWS I can transfer it to other regions, like I said, but also to other providers, fairly easily.
Returning my SIM
When I left Korea, I had to return the SIM. The staff lady from KT looked at me sternly and said “you have an overcharge”. Fuck, I knew it. How was I going to explain I uploaded 4,000 GB on my phone? She said, “please pay now”, and gave me the bill. Normally it’s like $10 per GB, so I was going to have to pay $40,000. Oh my god:
That wasn’t 40,000. That was 52,700. Wait. My face:
But we were in Korea: that wasn’t USD, that wasn’t EUR, that was KRW. That means it was $48. And the overcharge was $8, because I made some voice calls. Not data.
2 months, 4 TB for $48 per month. Thank you Korea, godspeed to you.
Wireless is the future. No, it’s the NOW, and Korea shows it. Europe is trying to put expensive fiber into ground that’s already overflowing with broken copper wiring they’re pushing slow ADSL over. And the US is similar. What about just going wireless, guys/girls? It’s silly we’re still trying to push so much traffic over old wires.
I have no idea how the Koreans do it, but they do it.
So if you have my backup problem, and you want to finally put it in the cloud, and you’re in Asia, and you want to eat BBQ while at it: just fly to Korea, go housesitting, get a SIM and transfer your HDDs to Amazon AWS.
These days building an MVP is easy and launching it is a challenge, but if you succeed, your site will usually stay up (since it’s such a basic version). The problem really starts when you start growing your site, and your site’s traffic starts growing with it.
That’s what happened with Nomad List. In half a year it grew to over 500,000 pageviews per month, with over 100 million assets served per month (that’s about 200 assets per page load, mostly pictures of the cities). My tiny Linode NGINX server has been pretty good at handling it, but it hasn’t made the site particularly fast.
The thing is, when your site is going well, you’re now competing with all the other big sites. And they have people dedicated to making everything fast. The more mainstream you go, the smaller the share of “mega fans” willing to wait for your site to load. You’ll be seeing “normal” people visit, you know, the ones coming from Facebook etc., and if your site isn’t loaded in a few seconds, they’ll call it “slow”.
I mean, honestly, I do the same. So a few weeks ago, I was done with it and stuff had to speed up.
I used Pingdom to test load times, always picking San Jose, CA as the test location. My Linode is in London, so I felt that’d be a good test. I must say Pingdom is not incredibly reliable, so take all of this data with a big grain of salt: load times vary somewhat even without changing anything.
I already implemented basic caching months ago. Why? Well, the city database had become so big, with 550 cities and 65 attributes per city (like apartment rent cost, hotel cost etc.), that it was now over 35,000 data points. That’s fine, but there were calculations done on those, and they were done on every page load. I know, stupid. It worked in the beginning with 50 cities and 500 data points, not with 35,000. Doh.
So I built a script to generate static pages for each combination of currency (EUR, USD, GBP etc.) and unit system (metric or imperial). A cron job runs it every few hours or so. I know, tedious, but it works for now, until I deploy a new version of the site that does this in a better way.
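As a sketch of what that generator boils down to (the filenames and the exact currency list here are hypothetical):

```python
from itertools import product

CURRENCIES = ["USD", "EUR", "GBP"]   # subset; the real site supports more
UNITS = ["metric", "imperial"]

def page_variants():
    # one pre-rendered static HTML file per currency x unit-system combo,
    # regenerated by the cron job every few hours
    return [f"index-{cur}-{unit}.html"
            for cur, unit in product(CURRENCIES, UNITS)]
```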
Caching brought down the load time from a crazy 29.89s (which users never saw, as I implemented caching way earlier) to 3.90s.
Conclusion: Caching processing in a page speeds up everything drastically (well doh)
With SPDY enabled, the load time actually went from 3.90s (on SSL) to 3.97s. Not really sure why; maybe Pingdom doesn’t support/measure SPDY? For browsers supporting SPDY, it definitely sped things up noticeably, making the site much more snappy, especially on subsequent loads. That means once you’re already on a site, page loads from then on are much faster: SPDY keeps a live connection between the user and server, so it doesn’t have to re-connect on every page load (like HTTP/HTTPS does).
Conclusion: SPDY doesn’t change speed significantly according to Pingdom, but does make it snappier on browsers that support it
CloudFront by AWS
@levelsio Why don't you load the static files from a CDN (Cloudfront with a custom origin). The CDN is location aware.
@PizzaPete on Twitter recommended I set up CloudFront, Amazon AWS’s CDN service. A CDN is pretty much a lot of servers around the world that mirror your content, so that when a user requests a file (like a photo of a city on Nomad List), it gets sent from the location closest to the user (e.g. a user in Amsterdam gets the image from London, a user in Shanghai gets it from Tokyo).
Setting this up was quite easy.
I simply made a script that, after scaling my images, would put them up on S3 with s3cmd:
s3cmd put /srv/http/nomadlist.com/public/assets/cities/500px/amsterdam-netherlands.jpg s3://nomadlistcdnbucket/assets/cities/500px/amsterdam-netherlands.jpg
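Generalized into a loop, that per-file command looks something like this (a sketch, not the actual script; you’d hand each command to the shell after the image-scaling step):

```python
import os

BUCKET = "s3://nomadlistcdnbucket"
ROOT = "/srv/http/nomadlist.com/public"

def s3cmd_commands(local_paths):
    # mirror each scaled image to the same path inside the S3 bucket,
    # producing s3cmd invocations ready for subprocess.run or a shell script
    cmds = []
    for path in local_paths:
        rel = os.path.relpath(path, ROOT)   # e.g. assets/cities/500px/...
        cmds.append(f"s3cmd put {path} {BUCKET}/{rel}")
    return cmds
```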
Then I went into AWS panel and set up a CloudFront distribution that connected to my S3 bucket.
This means that instead of loading the images from https://nomadlist.com/assets/cities/500px/amsterdam-netherlands.jpg, it would now load them from https://d39d3kdh3l.cloudfront.net/assets/cities/500px/amsterdam-netherlands.jpg.
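The swap itself is trivial; a sketch of the rewrite (the CloudFront domain is the one from the URL above):

```python
ORIGIN = "https://nomadlist.com"
CDN = "https://d39d3kdh3l.cloudfront.net"   # your CloudFront distribution

def cdn_url(url):
    # swap the origin host for the CloudFront one; the path stays identical
    # because the distribution points at the same S3 key layout
    if url.startswith(ORIGIN + "/assets/"):
        return CDN + url[len(ORIGIN):]
    return url
</antml```

Non-asset URLs pass through untouched, so the rest of the site keeps hitting the origin server.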
So did it speed things up? Let’s see.
CloudFront doesn’t support SPDY, only SSL. But with my own server’s SPDY enabled and CloudFront over SSL, load time went from 3.97s to 3.86s. Again, a very tiny difference. Without SPDY and just SSL it decreased to 3.82s. Over HTTP, CloudFront’s speed up was the biggest, from 5.40s to 3.64s. That’s almost 2 seconds!
Conclusion: CloudFront seems to improve load times over HTTP, SSL and SPDY, with a very strong effect over HTTP
Another tip was PageSpeed, recommended to me by KJ Prince. I think I’m pretty late to the party, as Google seems to be deprecating it. But I wanted to try it anyway.
It decreased load times on SSL from 3.86s to 3.82s. Over SPDY it actually increased load times, from 3.97s to 4.06s! Again, the effect was strongest over HTTP: from 5.40s to 4.13s. Over 1 second won.
Conclusion: PageSpeed seems to improve load times over HTTP and SSL, but not over SPDY, with a very strong effect over HTTP
PageSpeed + CloudFront
Let’s combine the two and see what happens.
With both enabled, on SPDY we get 3.95s. Without both enabled, SPDY was actually faster! Over SSL we get our fastest time of 3.79s. And over HTTP, we go from 5.40s to our fastest time of 3.56s. Again almost 2 seconds won.
Conclusion: PageSpeed and CloudFront combined seem to improve load times over HTTP and SSL, not over SPDY.
SSL vs. SPDY on Chrome
Since Pingdom doesn’t give us realistic results for SPDY, let’s test it with Chrome on OSX. I tested from Bangkok, Thailand on a solid connection. Since my server is in London, that’s 9,526 km away. Pingdom’s connection from San Jose to London was about equal, at 8,627 km.
Conclusion: On Chrome, SPDY seems to improve load times vs SSL, from 5.24s to 3.58s (-1.66s) or a 31% speed increase.
The best result we got is CloudFront + PageSpeed over HTTP with a speed increase of 34% (-1.84s), the second best was from SSL to SPDY with CloudFront + PageSpeed enabled on both with 31% (-1.66s).
The odd result here is that there’s no real difference between using only SPDY (3.97s), only SSL (3.90s), SPDY + CloudFront SSL + PageSpeed (3.95s) or SSL + CloudFront SSL + PageSpeed (3.97s). That might be explained by the fact that SSL slows down all connections, and SPDY only speeds up connections if everything is on one server. If you use a CDN like CloudFront, you add more servers the client has to connect with, and on SPDY the first connection actually takes a lot of time; from then on it’s fast. So for some sites, it might be faster NOT to use a CDN if they use SPDY.
PageSpeed’s results were negligible.
I managed to speed up the site by 34% on HTTP and by 31% on SPDY. I will keep CloudFront’s CDN enabled for the mere fact that it still might help speed up connections in remote parts of the world. That’s a guess though.
So I know by now I do most things in programming in a weird and unconventional way, but somehow that has worked pretty well for me. My sites are definitely a bit more buggy than most, but I ship quite a bit faster too. You can’t have it all.
One thing I never was able to learn properly was commenting my code. The way people usually comment code is this:
This was taken straight from PHP The Right Way. The thing I struggle with is that the deeper you get into indentation, the more of a mess this becomes. What if you’re deep inside two foreach loops; how do you have any idea which part of the code starts and ends where? The issue is that the commenting doesn’t segment the code in any way. You can’t see where a piece of code ends.
You can use functions to summarize code into one line. That solves a lot. But making every little snippet of code into a function can slow you down too.
What if there was a middle ground?
I got inspired by HTML: its code is always very clear because you see when a tag starts and ends. So here’s how I comment based on that:
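The original post shows this as a screenshot. Reconstructed in Python comments (the original is presumably PHP, but the tag style translates to any language; the logic inside is purely illustrative), it looks something like this:

```python
def signup(user, plans):
    # <validate input>
    if not user.get("email"):
        raise ValueError("email required")
    # </validate input>

    # <find matching plan>
    plan = None
    for p in plans:
        if p["id"] == user.get("plan_id"):
            plan = p
            break
    # </find matching plan>

    # <build response>
    return {"email": user["email"], "plan": plan["name"] if plan else "free"}
    # </build response>
```

Even deep inside nested loops, the closing tag tells you exactly which block just ended.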
Even more fun: if you use Sublime Text, you can simply fold down the entire part between tags, like this:
It’s probably wrong in the grand scheme of coding laws written up by the board of bearded men who roam around Hacker News. But that’s fine. I don’t have a beard.
I just spent the last week moving ALL my sites to be HTTPS by default. It’s tedious to set up and pricey ($10/y per domain). But it’s worth it. And if you’re a maker of sites and apps that people from all over the world use (like many of you reading this are), I think it’s your responsibility to set it up by default now. It gives your users increased privacy and security and it even has advantages for us too.
With HTTPS enabled by default, your users are more secure. If you’re letting users log in with passwords, for example, without it those passwords are transferred over an insecure connection. Even personal data like addresses can be sensitive. Many of us are building apps that include chat functionality; the stuff said there can be sensitive too. This gets especially dangerous with people increasingly working on public/shared WiFi (e.g. coffee shops, hotels), where snooping passwords is literally as easy as installing Wireshark. I’ve tried it, and I was able to read the packets of most of the people on my hotel’s network. That’s insane in 2015.
Better privacy for your users
With the internet spreading to more places in the world, there’s a higher chance your site will be used in places with less freedom of speech than your country. If snooping on other people’s connections is so easy without HTTPS, I think it’s our responsibility as site/app builders to at least try to protect our users’ security with HTTPS:
Whenever you use an HTTP website, you are always vulnerable to problems, including account hijacking and identity theft; surveillance and tracking by governments, companies, and both in concert; injection of malicious scripts into pages; and censorship that targets specific keywords or specific pages on sites.
Even non-interactive sites should think about this, because without HTTPS the URLs people visit on your site are shared publicly. Stuff like which news a user reads, and their particular choice of any other media consumption (think adult content), should be private by default, as it increases the odds of being profiled by governments and companies. Even if they have no bad intentions, everyone has a legal right to privacy. For example, reading lots of articles on Wikipedia about terrorism might get you profiled as a terrorist. Luckily Wikipedia already set HTTPS as default 2 years ago.
Better referral data
You know how in Google Analytics you’re getting less and less referral info these days? Well, that’s partly because browsers don’t pass the referrer along when users go from an HTTPS site to an HTTP one. It means that if your users come from an HTTPS site (like Google itself) and you’re on HTTP, you won’t see any referral info. If the referrer is HTTPS and you are too, you will. With more sites switching to HTTPS as default in the future, this seems like a good choice.
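That referrer behavior boils down to one case. By default (before the later Referrer-Policy headers), browsers drop the Referer header only when going from a secure page to an insecure one:

```python
def referrer_sent(from_scheme, to_scheme):
    # default browser behavior: the Referer header is dropped only
    # when navigating from an HTTPS page to an HTTP one
    return not (from_scheme == "https" and to_scheme == "http")
```

So serving your site over HTTPS is the only way to keep seeing referrals from HTTPS sources like Google.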
Google ranks you better
(..) we’re starting to use HTTPS as a ranking signal. For now it’s only a very lightweight signal (..) But over time, we may decide to strengthen it, because we’d like to encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.
Google has stated it will now reward HTTPS-default sites with higher rankings in search results. It also rewards fast sites (low load times) and responsive sites (mobile-friendly layouts). So combining those theoretically gives you a boost.
There’s one disadvantage:
Load times increase over HTTPS, right? It makes things slower. But does it always have to be like that? No. HTTPS can actually be faster than HTTP: