June 4, 2017 § 1 Comment
My research and observations over the last couple of years have shown that Alexa is a biased and dishonest ranking system, and it is only getting worse every day. Yet they still insist that their ranking matters in business, which has had a seriously negative impact in less internet-aware countries like Bangladesh. In this post I will show a few examples that clearly indicate how fraudulent their rankings are.
Before going into detail, I would like to share a personal experience with Alexa ranking from some experiments I ran in 2016. A site of mine that had almost no visitors (it contains only some password-protected backend panels) reached a country rank of 600, surpassing many popular sites, entirely from my own visits using a few techniques that Alexa seemed to care about. That experiment was groundwork for writing an article like this one.
Alexa basically has two types of rankings in its business logic – one that pays them handsome money every month, and one that does not pay a penny. There is also a third, hidden category: sites that naturally attract enormous traffic on their own and do not need to care about Alexa or any sort of ranking.
Paid sites get an abrupt positive jump in their ranking (upon payment, of course), no matter what their previous ranking was or how many visitors they really have. Alexa checks no parameters before listing them as top-ranked sites. I found that even domains registered only a few months earlier were placed above high-traffic sites without any scrutiny or calculation.
The sites I picked for this comparison are listed below (purely for technical analysis – no offense intended):
A few facts before more detail:
- The first two are the most popular online news portals in Bangladesh, while the latter two are still unknown.
- The first two have a Google Page Rank of 4 and 5 respectively, while the latter two both have 0 (zero).
- The first two domains were registered in 2006 and 2004 respectively (more than 10 years old), while the latter two were registered at the end of 2016 (only a few months old).
Now let's see how many resources Google, the biggest search engine, has to show for them in search results:
- The first one has about 1.28 million (1,280K) pages indexed with Google.
- The 2nd one has 819K pages indexed with Google.
- The 3rd one has only 168 pages indexed with Google, and most of them are broken (they returned 404 during my tests).
- The 4th one has 3600 pages indexed with Google.
Now let's see what Alexa rank each of them has at this moment (June 4th, 2017). This is simply shocking:
Now let's see how abruptly the ranks of the 3rd and 4th sites changed in Alexa's ranking system:
This shows how their rankings climbed overnight. The red-circled points mark the dates when they were moved from one position to another abruptly. In other words, these are the dates when their payments took the intended effect (the green tick marks beside the line “Alexa Traffic Ranks” indicate a paid, and hence verified, site). In a real-world situation, a change like this is practically impossible.
I have many other points to mention in this regard, but I think this topic has received enough attention already.
Anyway, these 4 sites are not the only ones of this kind. There are many more where similar discrimination can be found.
The purpose of this technical analysis is not to humiliate or undermine these 4 sites, nor to publicize them in any way, but to let decision makers and business people who still consider Alexa a metric of a site's popularity know that the reality is quite different from the number shown in the rank.
If this helps even a single person or decision, the purpose of this post will find its meaning.
Thanks very much for reading this.
September 19, 2015 § Leave a comment
Hello – just wanted to write something after coming back from a long break. Please excuse my absence; life has been very busy with so many other priorities.
This post is for those who are using a PHP-DKIM solution and encountering the error below –
openssl_sign(): supplied key param cannot be coerced into a private key
This happens when the wrong value is passed as the private key to openssl_sign(). The third parameter should be the “private key id” – the resource returned by a call to openssl_get_privatekey() – but it often mistakenly gets passed the raw string contents of the private key file instead.
So, here is the solution –
1) Prepare the “private key id”:
// read the private key file into a string, then close the handle
$fp = fopen("/path/to/file/.htkeyprivate", "r");
$privKey = fread($fp, 8192);
fclose($fp);
// obtain the private key resource ("private key id")
$pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');
2) Use in the openssl_sign:
openssl_sign($dataToSign, $signatureVar, $pKeyId);
If you’re using a class-based (object-oriented) PHP-DKIM solution, you may put the code below in the __construct() of the main class:
public function __construct()
{
    $fp = fopen("/path/to/file/.htkeyprivate", "r");
    $privKey = fread($fp, 8192);
    fclose($fp);
    $pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');
    $this->open_SSL_priv = $pKeyId;
}
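The whole fix boils down to signing with a key resource rather than a raw PEM string. Here is a minimal, self-contained sketch of the idea; note it generates a throwaway RSA key with openssl_pkey_new() instead of reading your real DKIM key from a file, and the data being signed is just a placeholder:

```php
<?php
// Generate a throwaway 2048-bit RSA key for demonstration only
// (in real use, read your DKIM private key from its file instead).
$res = openssl_pkey_new([
    'private_key_bits' => 2048,
    'private_key_type' => OPENSSL_KEYTYPE_RSA,
]);
openssl_pkey_export($res, $privKeyPem); // PEM string form of the key

// Convert the PEM string into a private key resource ("key id").
$pKeyId = openssl_get_privatekey($privKeyPem);

// Sign some data – passing $pKeyId (not the PEM string) is what
// avoids the "cannot be coerced into a private key" error.
$dataToSign = "example data to sign";
$ok = openssl_sign($dataToSign, $signature, $pKeyId, OPENSSL_ALGO_SHA256);

echo $ok ? "signed, " . strlen($signature) . " bytes\n" : "signing failed\n";
```

As a side note, newer PHP versions are more forgiving and will accept a PEM string directly in openssl_sign(), but going through openssl_get_privatekey() works everywhere and lets you supply a passphrase for encrypted keys.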
Hope this helps you. I will definitely try to write again whenever I get time. There's a lot to share from day-to-day development work, but time has been scarce.
December 31, 2012 § Leave a comment
2012 passed really quickly – I still remember the eve of 2012, when I wrote my last blog post here. The year went so fast (or maybe I've been slower) that I failed to write another post since then. A few of my friends, quite a few of my juniors, and a good number of clients expected activity on my blog, but I couldn't manage the time to write, although I always planned to.
However, the year wasn't as good as 2011. I learned some lessons from my plans and activities that should help me do better in 2013. I got some good opportunities but didn't use them properly. I was sometimes frustrated for many reasons, and then thought through and planned many things which I believe will be helpful in the new year.
The most memorable thing about 2012 is that my daughter Suha started school on November 1st. It was a very colorful moment for Surida and me. Since then I have felt like a more responsible person, and an even prouder father.
In my professional life, I was able to carry out my responsibilities at my OneSpout job, and also completed some good medium-sized projects. I contributed to a Denmark-based golf administration system – a noteworthy project of 2012. Overall, I gained some good experience.
The new year is only 2 hours away. I wish everybody a very happy new year, and I hope things will be better in it.
December 31, 2011 § Leave a comment
To my friends, colleagues, and blog visitors – I'm very sorry for not writing anything for a long time. So I felt I must write the only post of 2011; there are no more days left to write another. I just want to bring you up to date on my life and work.
2011 was quite a good year for me personally, but it was not good for the nation, as we didn't see any significant progress in our national life. A lot of negative things happened in our country, and only a little was positive. This year the nation experienced many disappearances, and the state itself became a source of insecurity for its people. We did, however, make some good progress in information technology, outsourcing, and freelancing – probably the only sector where we can expect good progress, because it is driven by our talented and hard-working young generation.
Personally, I spent a good year with my family. We moved to a new place which is better than the previous one. My daughter brought a lot of excitement into our lives. We spent the year without any severe tension, and attended some very gorgeous wedding ceremonies.
In my professional life, the year was good except for the last 2 months. Our new office really helped; I brought in some new people and engaged them in my in-house development. We made some good progress – the results should appear in mid-2012. Our pace was quite good, and I want at least the same pace in the coming months.
I tried my luck on oDesk and found success quickly. At some point, oDesk identified me as one of their most successful contractors in the shortest time (i.e. a fastest-growing contractor), and then offered me the role of their Country Manager for Bangladesh. My primary goal was to create more successful contractors from Bangladesh by arranging technical events and seminars, which mainly involved inspiring young technical people here. I helped create quite a few successful contractors using my own experience. It was an awesome experience in my life.
I had a really good time with my development work on the web scraping / data mining track. I was very lucky to get the opportunity to contribute to good projects like onespout.com and ave23.com. Moreover, I received a huge response to my web scraping services.
This year I gained really good experience with daily deals and their aggregation. I worked with content from a variety of daily deal sites and got a good overall idea of the present and future of daily deals – in terms of both development and business.
Another year coming, I’m hoping to do better…and I wish you all the best.
Happy New Year 2012!
July 17, 2010 § 3 Comments
We get all kinds of web scraping projects, and we are capable of handling any of them. Here is a list of the website scraping, data mining, extraction, reporting, and parsing services we receive frequent requests for:
- E-commerce scraping for price comparison purposes
- Real estate scraping for building property websites
- Job scraping for building job archives
- Business directory scraping for building large address books with addresses, emails, and phone numbers
- Media scraping in order to build media archives
- Vehicle list scraping
- Social network scraping
- Other scraping jobs like deal data, race information, etc.
I hope this helps our future clients get an idea of our services.
Contact us via email to: dev <at> proscraper [dot] com OR by filling the form on our website http://www.proscraper.com/
Being a Professional Developer on Website Scraping, Crawling, Data Mining & Extraction, Parsing and Reporting Services
July 17, 2010 § 2 Comments
As you may know, I have been concentrating on website scraping for a few months now, and I wanted to share some of my experiences from this journey. What I must say before anything else is that it has been a really enjoyable part of my continuing web development life.
I started web scraping and parsing work in late 2005. I initially developed some contact grabbers to pull contacts from email clients like Gmail, Hotmail, and AOL. Later I had opportunities to develop scraping tools for a few (3 or 4) real estate companies – they used my automated scraping tools to scrape competitors' property listings and compare prices. In late 2006, I started working for Beendo Corporation to scrape popular email clients and integrate all emails into one place. It was a big project that wasn't just about scraping emails: we had to scrape emails, attachments, and contacts, and develop processes for sending, replying to, and forwarding emails through the respective clients. We also added facilities to grab emails and contacts from popular social networking sites like Facebook, MySpace, Friendster, Ringo, hi5, etc., and finally integrated emails from POP3/IMAP servers. After this, we developed a Facebook version of the whole project and released it in May 2008 (see details).
In 2009, I did quite a few web scraping / data mining projects besides my mainstream development – real estate scraping, media scraping, e-commerce scraping for price comparison, fashion product scraping (t-shirts, shoes, etc.) – all of which brought significant expertise to my development. I then stopped taking scraping projects and concentrated on other fields to gain new skills. But scraping projects were always enjoyable for me, and I kept receiving offers from valuable clients. Thus it has remained a part of my regular development.
I have developed large scraping tools to collect product data from popular stores (clients use the data for price comparison) like Google Products, buy.com, amazon.com, Adorama, Newegg, and many more. A few months back, I developed a job listing site, http://joblance.info, which lists jobs from popular online job sources – behind the scenes, scrapers run continuously to keep it updated.
These are all my past scraping / data mining experiences. Now I'm concentrating on web scraping / data mining again, alongside other development areas.
I have been getting a good number of scraping offers over the past few weeks.
Besides being a professional scraper myself, I am trying to build a web scraping team to handle more projects. I have already started with someone who may need some time to become an expert, but things are going well. Like the e-commerce team, our scraping team will work under its own banner at http://www.proscraper.com/ and will be known as “Professional Scraper”. I am hoping for a very nice journey – so far, so good.
Please feel free to communicate and discuss about your scraping & data mining projects. Contact us via email: dev <at> proscraper [dot] com
June 15, 2010 § Leave a comment
For my clients – I just wanted to put on record the answer to the question I get asked most (“how do I pay you?”):
1. Guideline to Pay Me Using Payoneer:
Or follow the below steps:
- Go to http://www.payoneer.com/
- Click the “Load money to card” button on the right side
- Choose “Load money by email address” and type my email address: firstname.lastname@example.org
- The next steps are self-explanatory
After the payment, make sure payoneer has shown you a message like below:
Thank you for loading money onto a Payoneer Prepaid MasterCard. Your request for $AMOUNT_PAID is being processed. We will let you know when the payment is approved. Once approved, the payment (minus any applicable fees) will be loaded to the requested Payoneer card within 2 business days. Both you and the cardholder will be notified by e-mail.
2. Guideline to Pay Me Using MoneyBookers:
- Go to http://www.moneybookers.com
- If you already have an account with moneybookers.com, log in to it – or register for one and then log in. It is very simple.
- Once you are logged in, click the “Send Money” button at the top left
- Use my email address where it says “Recipient’s email address”: email@example.com
- The next steps are self-explanatory
Hope this helps. But if you ask me the same question again, I will simply paste this link for your perusal so you know what to do next. :-)