Dear Bing…

11 Feb

Why does this look so HORRIBLE?

Oh my, my, MY EYES!

Who is in charge of your design?

Have you done any A/B testing?

Do you want to be taken seriously?


A while back I happened to catch a BBC documentary called “Traders: Millions by the Minute”. It showed how people were able to trade from home and followed a few key people as they made and lost money.

I’d always been interested in owning shares and possibly dabbling in Forex trading, but the former was prohibitively expensive and the latter mind-bogglingly complex. However, one of the people on the show was using a website that allowed you to copy other people’s trades with whatever money you could afford.

This website was eToro.

I think I should take a few moments to put down in writing what I’ve learned in the last few months so that new users can avoid the mistakes everyone seems to make (myself included).

Rule 1: Get Free Money

If you have a friend already on eToro, get them to send you their affiliate link. When you join up with that link and deposit a minimum of $100, eToro will give you $50 (and your friend $100) of eToro credits in your account. Although you can’t withdraw eToro credits, you can use them as part of your investment portfolio and they never expire.

At first I thought that sucked, but the more you play with eToro the more you realise that you are always going to keep a few hundred in there even if you had to withdraw the rest. So grab your money while you can.

If you don’t have a link, please feel free to use my link here, then we’ll both be better off. But don’t join up without a link: you’d just be throwing money away.

Rule 2: Use Your Practice Account

Every account on eToro has a practice account with $100,000 in it. Use this to practice before you throw your money around. Get used to the markets’ peaks and troughs, and try and work on a strategy.

Rule 3: If you don’t have much cash, copy other traders or deal in stocks only

If you start off with only a little cash you’re best off dealing only in shares or copying traders. The reason for this is leverage, risk and money management.

If you deal in shares, the likelihood is that the price will vary very little over the course of a day. You can buy $10 worth of shares in Apple and leave it. You will pretty much always have at least some money when you go back to it.

If you put that $10 on currencies, the minimum leverage you can use is x100. That means that once the market moves 100 pips in the wrong direction, you will have lost all your money. 100% of your money.

You could try and use less of your funds – say $2.50 – but to place a trade for $2.50 you will need a leverage of x400. At that leverage the market need move only 30 pips against you and your investment is gone.

Of course, it could go the right way and you will have doubled your money, but I’ve seen, many times, a flex of 100 pips one way followed by a swing of 150 pips the other, all within 5 minutes.

Profitable traders tend to use much lower leverage, such as x10, x25 or x50; doing so carries less risk but requires more capital. For example, a trade at x10 leverage requires a minimum investment of $100.
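
As a rough sketch of the maths in the last few paragraphs (my own simplification: it ignores spreads and margin-call thresholds, and assumes a pair quoted near 1.0, where one pip is 0.0001):

```python
PIPS_PER_UNIT = 10_000  # 1 pip = 0.0001 for a pair quoted near 1.0

def pips_to_wipeout(leverage):
    # Each pip moves your equity by roughly leverage / PIPS_PER_UNIT of your
    # stake, so losing 100% of it takes PIPS_PER_UNIT / leverage pips.
    return PIPS_PER_UNIT / leverage

print(pips_to_wipeout(100))  # 100.0 pips at x100
print(pips_to_wipeout(400))  # 25.0 pips at x400 (margin rules nudge this nearer 30 in practice)
```

The takeaway: the wipe-out distance shrinks in direct proportion to the leverage you use.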

Luckily, though, eToro let you copy traders, which means you copy every one of their trades proportionally. Say you invest $100 in copying a trader who has $1,000 in their account. When they place a $100 trade, your account will place a $10 trade and copy their leverage too, making this the only way to trade a small amount at low leverage.
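
The proportional sizing is simple to sketch (a toy calculation of my own, not eToro’s actual copy engine):

```python
def copied_trade_size(copy_amount, trader_equity, trader_trade):
    # Your position scales with the fraction of their equity the copied
    # trader commits to the trade.
    return copy_amount * trader_trade / trader_equity

print(copied_trade_size(100, 1000, 100))  # 10.0 — their $100 trade becomes your $10 trade
```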

I strongly recommend following people, at least until you know what you are doing and have learnt from them.

Rule 4: Follow the Right People For You

I could tell you who I think are the right traders, but they’re good for me and me alone. I have, at the time of writing, chosen 3 traders to copy who all bring something unique to the table:

1. @Dluginacci – trader from Ireland. Brilliantly disciplined. At the end of the day he shuts down trades regardless of where they are. You will often see trades that are cut short before they make a profit.

2. @MRoberts – an astrophysicist from the UK who is as bright as a button. When he saw a collection of trades on the EURUSD pair go the wrong way, he piled in with a massive trade the other way, waited ’til it covered the current losses of the other trades, then shut them down.

3. @Wintrader999 – Another disciplined trader. From Germany, extremely low risk, but good return.

You may want traders that are really low risk, returning 5% per month consistently. You might want to copy traders who only deal in indices, or commodities, or shares. You must choose the traders that are right for you. Luckily, eToro have made a tool so you can search for just that.

Rule 5: Don’t Copy Traders’ Trades

Unless you’re also copying the trader’s leverage, this is a really big no-no.

I started by copying traders’ trades at a higher leverage – so when they placed a 2% equity investment at x50 leverage, I would open a 1% equity investment at x400. To begin with, my luck held, but this system is disaster-prone. Sooner or later a spike will come along that wipes you out.

The traders know what they’re doing. You don’t. They might close a trade before it reaches its stop loss (SL), while your trade sits there losing all its profits and worse.

Rule 6: If you copy someone, make sure you know what their open trade situation is

Only leave the ‘copy all open trades’ box ticked if all of their open trades are in the red. Don’t know whether they are? Then check out their page. Annoyingly, this page can’t be seen on mobile or in the app, so you have to choose ‘desktop version’ from the menu tab, navigate to their page and click ‘portfolio’ > ‘open trades’.

If in doubt, leave it out

Rule 7: think about investing in shares

Shares are (usually) pretty safe. They generally don’t lose you all of your invested money overnight.

eToro don’t have all shares, by a long, long way, but they have a lot. Invest in shares you think will make money over the course of a year, not over a month, a week, a day or an hour.

Remember: you can’t short shares, and shares are only bought once (sometimes twice) per day, so plan your purchase carefully.

Rule 8: Only invest what you can afford to lose

This is the biggest rule. Say you lost all your investment tomorrow. You’d be sad, but would you be homeless? Would it affect your standard of living? If the answer is yes, then do not invest that money. You can lose it in an instant.

Last week, the SNB cut interest rates and stopped supporting the euro at 1.20, which sent the USD/CHF rate plummeting. There was no liquidity in the market, so stop losses were not hit and people ended up losing $1,000 on $40 trades.

Play it safe and you’ll win.

Rule 9: Give it time and you’ll be fine

If you’re sticking to the rules above and you have lost money in the first week or first month, then give it time. Are you sure the traders you’re copying are worth their salt? If you are (and other people are) then let everything mature.

Investing isn’t a race. It’s more like compound interest. Say you made 2% every day; an increase of 2% is a multiplier of 1.02:

1.02 to the power of 5 (one week) is an increase of 10.4%

1.02 to the power of 20 (~one month) is an increase of 48.6%

1.02 to the power of 240 (~one trading year, minus public holidays) is roughly 116 times your investment – an increase of over 11,000%. Now, this rarely happens. People take out money and get edgier / more cautious as the stakes increase, but even if you returned just 1% every day you would, in theory, end up with around 11x your initial investment, and 0.5% a day still gives about 3.3x.
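
Those compounding figures are easy to sanity-check in a couple of lines:

```python
def growth(daily_return, trading_days):
    # Multiplier on your starting pot after compounding a fixed daily return.
    return (1 + daily_return) ** trading_days

print(round(growth(0.02, 5), 3))    # 1.104 -> +10.4% in a week
print(round(growth(0.02, 20), 3))   # 1.486 -> +48.6% in a month
print(round(growth(0.02, 240)))     # 116   -> ~116x over a trading year
print(round(growth(0.01, 240), 1))  # 10.9  -> ~11x at 1% a day
```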

So remember to bank profits and don’t worry if you missed the opportunity – the market is always going up and always going down. There’s always a way to make your money.

If you get flustered, you’ll lose the lot.

Rule 10: eToro isn’t magic

eToro isn’t magic. It can’t magic money up for you, but if you stick to the rules above and put in a little work, then the chances are that you will double your money in 6-12 months.


In an apparent move intended to be evil, Google have just rolled out a new SERP interface.

First impressions are:

  1. Although it seems to have more space, it also seems to make everything harder to read
  2. Ads are very much more ‘discreet’ (ie they blend in with the ‘organic’ listings)
  3. The Goooooooooogle pagination no longer straddles the main content and right-hand side content, making it appear tiny.

Harder to Read

The reason it’s harder to read is that they’ve done away with the underline. Compare the following two screens taken just now, with new Google on the left and old Google on the right (taken from Google News SERPs, as they seem unaffected by the change):



Goooooooooogle is smaller

At first glance, the Gooooooogle pagination looked a lot smaller than it actually was – just 3px difference.


Ads are more ‘discreet’

Well, I say discreet; I really mean that if you didn’t already know where ads are usually placed on a Google SERP, you would just click on ads without realising.


If you’re having difficulty deciphering what is an advert, take a look at the image below.


Wait, Google said “Don’t be evil”, right?

Well, those days are long gone. If Google believed they were building a better ‘search experience’ for consumers, they wouldn’t have made shops pay to be included in the Shopping search results.

It’s all about the money, honey. And that’s a real damn shame.

I watched the Tom Cruise sci-fi movie Oblivion the other day and wondered, in a geeky sort of way, where the co-ordinates that flash up on a computer screen point to in the real world.

Set in a dystopian future, a shuttle falls to earth at a lat/long of 41.146576,-73.975739.

And that equates to 17 Mein Drive.

Now, if I were in charge of making up co-ordinates for a movie, I would have chosen something ironic – maybe the Oblivion Alton Towers ride, or the Oblivion Taproom, Florida – but 17 Mein Drive? Didn’t make sense.

So I googled 17 Mein Drive and found the following: 17 Mein Drive – David Feinsilber.

It didn’t mean much, but it was an unusual name, so on to IMDb, where we find that David Feinsilber was the visual effects production supervisor for Oblivion.

Great perk of the job.

I tried to get in touch with David to see if he would confirm this, but so far he hasn’t responded. If he does, I’ll update the post.

Alexa vs Reality

I’d always wondered how closely Alexa’s traffic graphs mirror reality. In a recent article on how the Sun’s traffic was diving uncontrollably, I used an Alexa comparison graph to illustrate my point but I’d never really put the time in to measure its statistics. I think it’s about time that I did.

Slapdash Methodology

My methodology was to use two high traffic sites so that the margin of error would be smaller. Pretty basic stuff. I’m sure someone with more high traffic sites and more time could do a better comparison, but I couldn’t see anything out there.

Site Number 1

Site 1's GA graph

So I took a screengrab of the last 2 years’ GA.

Then I got the same site’s graph from Alexa.

Site 1's Alexa Graph

Then I stretched the Alexa graph to make sure the legends matched up.


Then I tweaked the colours, removed the Alexa watermark, and away we go.


Then I made some final compensations for baseline disparity to get something that looks like this…

The Results of Site 1


I was a little surprised by this, because the results are much more accurate than I would have imagined. From April through to January, most spikes are faithfully reproduced in Alexa.

Spike A is perfect, but on the Alexa graph its size is the same as future traffic in September 2013, whereas according to GA we don’t hit traffic like that until October/November 2013.

Spike B is matched again, but this time the GA spike towers over the Alexa graph, and again with spike C, although the dip and subsequent spike are matched perfectly. This happens once more with spike D.

What Does This Mean?

Broadly speaking, traffic peaks and troughs seem to match pretty well. The only discrepancy is the scale of the graph. With Alexa, the traffic increases are, in general, disproportionate. But we should expect this, as Alexa’s metrics are not traffic related, but ‘reach’ related.

Alexa’s analytics are gathered from ‘thousands’ of browser plugins, and a site’s metrics are based on how it compares to all the others:

The Alexa Traffic Rank of a given website isn’t determined solely by the traffic to that site, but takes into account the traffic to all sites and ranks sites relative to each other.   Since your site is ranked relative to other sites, changes in traffic to other sites affect your site’s rank.


So, if your traffic goes up by 10%, but all the sites above you increase theirs by 20%, Alexa will show a drop in traffic on their graphs.
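
To see why relative ranking behaves this way, here’s a toy example with made-up traffic numbers:

```python
def alexa_style_ranks(traffic):
    # Rank 1 = the most-visited site; everything is relative, not absolute.
    order = sorted(traffic, key=traffic.get, reverse=True)
    return {site: pos + 1 for pos, site in enumerate(order)}

before = {"yours": 100, "rival": 95}
after = {"yours": 110, "rival": 114}  # you grew 10%, the rival grew 20%

print(alexa_style_ranks(before)["yours"])  # 1
print(alexa_style_ranks(after)["yours"])   # 2 — a 'drop' despite real growth
```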

How about site 2?

Results for site 2 mostly mimicked site 1: peaks and troughs mainly mirrored, but the overall scale of the graph looked a little off. So I decided to overlay the two sites’ Alexa graphs and their two GA graphs. This, I’m afraid, was harder to do than it sounds.

Whilst site 1 and site 2 have similar traffic levels, the spikes in site 1 throw the scales out a bit. Whilst this was easy to compensate for with GA, with Alexa it was tricky. Alexa graphs use a logarithmic axis to denote reach, and when I scale the image in Photoshop it does this by default in a linear fashion. I would have had to distort the image to get it to play ball and match up the lines, and perhaps I shall one day, but for now, we’ll have to live with a graph whose axes are slightly out.

Site 2 vs site 1 (Alexa) and site 2 vs site 1 (GA)

As you can see, blue (site 1) is regularly above red (site 2), and the period from January to June is shown as a big disparity between the sites. The period from August to December is shown on Alexa as a big increase for blue, but in reality the blue site trounces the red with mammoth spikes in traffic.


1. Alexa will show which site is ahead of another one, but don’t expect the differences to be as marked as they are in the graph

2. Alexa will show most peaks and troughs of all sites.

3. Alexa is not a proper analytics tool. Please use responsibly.


I was looking through Google Analytics stats recently to see how many poor old sods still use IE8, and it surprised me that the audience share of the Internet Explorer suite of browsers was less than 11%.

The reason that surprised me was that IE was still in the top 3 browsers (at #3, but still hanging in) yet commanded only a titchy market share.

When I started web development, IE was top of the tree; in fact, IE even had a version for Mac. Internet Explorer commanded the web: Netscape was near retirement and starting to hit the bottle, Mozilla was in short trousers, Opera was struggling to pick up girls at the disco, and Firefox, Chrome and Safari were just glints in the tech entrepreneurs’ eyes. By 2004 IE commanded 95% of all browser traffic.

Fast forward 10 years and the top 3 IE browser versions (10,9,8) account for 10.8% of all browser traffic and 95.5% of all IE traffic. These days the top 3 browsers take less than 60% of the market share.

The 10 most popular browsers (based on millions of impressions last month) were:

  1. Safari (26%)
  2. Chrome (22.3%)
  3. IE (11%)
  4. Android (9.9%)
  5. in-app Safari (9.8%)
  6. Firefox (7.9%)
  7. Opera mini (6%)
  8. Mozilla (2.5%)
  9. Mozilla Compatible Agent (1.2%)
  10. Blackberry (1%)

The Magic 5%

Until very recently, the rule of thumb when designing a website was that if a browser was used by more than 5% of the current (or predicted) user base, the site should behave perfectly in it. So reproduce those rounded corners, make sure any browser quirks are hacked around, and go that extra mile to ensure everyone has the same user experience.

Those browsers used by fewer than 5% of the audience would not be tested on. Or at least, not properly tested; one would ensure that users could view the site without it looking like a broken mess.

Sounds sensible. It was sensible. But things have to change…

Today’s 5 percenters

I split the browser stats by version and browser type and, out of the lot, just five combinations had over 5% of the market share. FIVE.

  1. Safari 7
  2. Chrome 31.0.1650.57
  3. Safari in-app
  4. Android 4.0
  5. Firefox 25.0

Internet Explorer 10.0 missed the cut by a tiny margin, so if we’re being generous, we can say that developers these days should just develop to 6 browsers, right?

Well, we can’t really develop for in-app Safari, as the app can use a myriad of settings, so we’ll bring it down to 5 browsers again.

But what if the client is using Firefox? Or an Amazon Kindle? Or Opera mini? What if there’s a slight difference between Chrome 31.0.1650.57 and 31.0.1650.63?

Well, maybe we can see how many browser/version combinations there are and create some kind of cut-off.

Good idea, but last month alone we recorded 4,470 different browser/version combinations in Google Analytics (i.e. no spiders or bots) and a mind-blowing 725 unique browsers.
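
The old 5% rule can be expressed as a simple filter. This sketch uses the illustrative family-level shares from the top-10 list earlier; note that at the individual-version level only five entries clear the bar:

```python
# Family-level market shares (percent of impressions) from the top-10 list above
shares = {"Safari": 26.0, "Chrome": 22.3, "IE": 11.0, "Android": 9.9,
          "in-app Safari": 9.8, "Firefox": 7.9, "Opera mini": 6.0,
          "Mozilla": 2.5, "Mozilla Compatible Agent": 1.2, "Blackberry": 1.0}

def must_support(shares, cutoff=5.0):
    # The rule of thumb: anything at or above the cutoff gets full testing.
    return [b for b, s in sorted(shares.items(), key=lambda kv: -kv[1]) if s >= cutoff]

print(must_support(shares))
# ['Safari', 'Chrome', 'IE', 'Android', 'in-app Safari', 'Firefox', 'Opera mini']
```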

4,470 Browsers

This is why the market share of the top 5 browsers is so titchy in comparison to previous years and leaves developers staring into an Escher staircase of browser testing.

A year ago we had 2,984 browser/version combos and 652 unique browsers; 7 years ago we had fewer than 40 browsers to deal with.

Browser Growth 2006-2013

So what next?

Well, I sure as hell ain’t browser testing 5,000 browsers next month, but I have been in situations where the site is deemed successful if it works on the CEO’s machines, regardless of how niche they are.

The most reasonable method would be to ensure the site works on an agreed set of the most popular browsers – IE10, Chrome, Safari, Firefox – and on an agreed set of tablets/phones.

If a CEO gets in touch saying that the site looks funny on a device launched after the site went live, or that a jQuery/CSS effect flickers on his obscure phone, then the account handlers have to have an awkward conversation.

Hopefully telling them this story of the 4,500 browsers will save you a bit of cash.

When OK! Magazine relaunched its website recently, I wasn’t expecting much. Another (somewhat) high profile responsive site that completely disintegrates in IE6, IE7 & IE8.

New OK! site rendered in IE8

Remember that for Windows XP users, IE8 is the latest Internet Explorer they can access, and it’s still the 2nd most popular IE browser, with 25% of the IE stats.

So if you’re building a site this winter, make sure you think of the poor buggers who get a broken looking page and not the finished article.

New Relic


We had a problem today with a third-party aggregator, News Now, who have been using wget to scrape the Daily Express site to gather content. All of a sudden, the files they fetched were being truncated by around 100 characters.

I tried it myself with curl and reproduced the same problem, but there were a few odd things about this:

  1. The source code for the same page was not missing any characters.
  2. My working copy of the same story on a Unix machine was intact when fetched with curl and wget.
  3. The HTML on the staging server on EC2 was also intact using wget and curl.

So, after eliminating the impossible (which I won’t bore you with), we were left with a problem that looked very improbable: New Relic were inserting JS code into the head and before the closing html tag to monitor users, but were not updating the HTTP Content-Length header.

Browsers are smart enough to ignore the Content-Length if it’s missing or incorrect, but wget and curl are set up by default to adhere strictly to the content length, hence the discrepancy.
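
That strict behaviour is easy to simulate in a few lines of Python (a sketch of the default client behaviour, not wget’s actual implementation):

```python
def strict_read(headers, body):
    # wget and curl trust Content-Length by default and stop reading there;
    # browsers ignore a wrong header and read to the end of the stream.
    declared = int(headers.get("Content-Length", len(body)))
    return body[:declared]

original = b"<html><head></head><body>story</body></html>"
headers = {"Content-Length": str(len(original))}

# A monitoring agent injects JS after the header has already been computed:
injected = original.replace(b"</body>", b"<script>/*rum*/</script></body>")

print(strict_read(headers, original).endswith(b"</html>"))  # True
print(strict_read(headers, injected).endswith(b"</html>"))  # False — truncated
```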

Short Term Solutions

1. Add the ‘--ignore-length’ option to wget.

2. Take New Relic off the live servers.

Medium Term Solution

We spoke to New Relic, who told us we could take the automated JS injection off and instead insert it ourselves onto every page. Doesn’t sound like much fun.

Long Term Solution

The long term solution for this would be for New Relic to update the Content-Length after it has messed around with the HTML, or even remove it entirely, but it doesn’t look like this is going to happen.



The latest in a series of Eureka moments concerning deviously tricky problems, this one drove me nuts. But it’s over now. Monster’s gone. Think of this as therapy.

The Problem

One of our clients (The Daily Express) had an issue affecting YouTube videos on their site. When they embedded the new(-ish) iFrame code on the site, it wouldn’t play for iPad users.

It wasn’t a problem at YouTube’s end, as the play button appeared properly, but an invisible layer prevented the red button from being pressed.


Well, the first suspect in anyone’s line-up would be z-index. I used WebKit’s web inspector to up the z-index to absurd levels, starting at 50 and ending up at 9,999,999 before I cut my losses and moved on.

More Clues

The web inspector didn’t show anything in front of the movie, and the text above and below it was selectable. Even the link to YouTube’s page was still clickable/touchable.

Trial by Trial and Error

So next I cut through the HTML, taking out huge chunks of code until the player started working. By reducing the code bit by bit, I found I could get the player working if I removed all the input boxes on the page using display:none.

Thinking glory was just a minute away, I turned off all custom styles on input boxes expecting to reproduce the success I had by hiding them. Alas, nothing.

To cut a long story short (too late), I made a local copy of the page with a local CSS file, cut the CSS (2,000+ lines) in half, then kept the half where the problem persisted. Repeat until you’re down to one line.
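
That halving process is just a binary search over CSS lines. Here’s a sketch, with a hypothetical still_broken() callback standing in for ‘reload the page and prod the play button’:

```python
def bisect_lines(lines, still_broken):
    """Binary-search a list of CSS lines for the single line that triggers a bug.

    still_broken(subset) must return True when the bug reproduces with only
    that subset of lines applied (assumes exactly one culprit line).
    """
    lo, hi = 0, len(lines)
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Keep whichever half still reproduces the problem.
        if still_broken(lines[lo:mid]):
            hi = mid
        else:
            lo = mid
    return lines[lo]
```

With 2,000+ lines this converges in about 11 reloads instead of thousands.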


So what was it?

a, object, input, :active, :focus {outline:none}

And dissecting further it became clear that just this

:active {outline:none}

is enough to stop YouTube / Brightcove / other iFrame content working on iOS devices.

The Solution

As we only really care about a:active and input:active, we were able to manipulate the CSS line to look like this

a, object, input, a:active, a:focus, input:active, input:focus {outline:none}

and still work.

But Why, But Why?

Good question, and one to which I don’t have an answer. The content of an iFrame can’t inherit CSS from the parent page, and yet the links at the top of the page were working while the button was not, which makes me wonder. And why would it work when I took the input boxes off the screen?

Perhaps if I have the time I can investigate, but for now I’m just relieved that it now works and we have a happy client once more.

So you can all have a tinker, I’ve included a video below and embedded that line of code.

The Proof

A few weeks ago I wrote an article on how The Sun allowed GoogleBot to access its site. It appears that this was in contravention of Google’s terms and News International have subsequently revoked GoogleBot’s privileged access, which has resulted in no more access for sneaky chaps like me and a further plummet in traffic for the website.



The Sun in violation of Google Webmaster Guidelines

Before The Sun’s decision to block GoogleBot, it was in clear violation of Google’s Webmaster guidelines on cloaking, which clearly state that “Cloaking refers to the practice of presenting different content or URLs to human users and search engines. Cloaking is considered a violation of Google’s Webmaster Guidelines because it provides our users with different results than they expected”.

General violations of Google’s cloaking guidelines were often exploited by ‘black hat’ SEO companies and punished harshly, as BMW found to their cost in 2006. But with Google News, the ‘grey hat’ technique of serving content to GoogleBot (and hence having your site properly indexed) while blocking the same pages to users is explicitly banned: “If you cloak for Googlebot, your site may be subject to Google Webmaster penalties”.

One cannot help thinking that Google were helpfully notified of this transgression by their friends in the media.

The Sun Dips Below The Mirror On Alexa

For the first time ever, The Sun’s main rival is now ahead in Alexa traffic (presumably the reason that The Sun has withdrawn from the ABC web traffic audit this month). The difference right now is slight, but blocking GoogleBot will ensure that The Sun’s traffic continues to go into free fall for some time to come.


The Sun Almost Disappears from Google News

Google News now indexes only ~200 articles, compared with 10,000 for The Express, and each link comes with a parenthesis of death:

It looks like the Sun keeps on falling and will soon be overtaken by the Daily Star online:

Sun vs Mirror vs Daily Star