OneSignal Push Notification in PHP: Simplified Function with Example

It's been a very long time since I posted any code here. I recently implemented web push notifications on a few websites using the very popular cross-platform OneSignal service. To reuse the code across multiple sites, I wrapped the message-sending logic in a function, and in this post I just wanted to share it with my readers.

function osAddPush($oneSignalConfig)
{
    if (sizeof($oneSignalConfig)) {
      $notifTitle = html_entity_decode($oneSignalConfig['title'], ENT_QUOTES, 'UTF-8');
      $notifContent = html_entity_decode($oneSignalConfig['brief'], ENT_QUOTES, 'UTF-8');
      $includedSegments = array('All');

      $fields = array(
        'app_id' => $oneSignalConfig['app_id'],
        'headings' => array("en" => $notifTitle),
        'included_segments' => $includedSegments,
        'isAnyWeb' => true,
        'url' => $oneSignalConfig['url'],
        'contents' => array("en" => $notifContent)
      );

      $thumbnailUrl = $oneSignalConfig['image_url'];

      if (!empty($thumbnailUrl)) {
          $fields['chrome_web_image'] = $thumbnailUrl;
      }

      $logoUrl = $oneSignalConfig['logo_url'];

      if (!empty($logoUrl)) {
          $fields['chrome_web_icon'] = $logoUrl;
      }

      $ch = curl_init();
      curl_setopt($ch, CURLOPT_URL, "https://onesignal.com/api/v1/notifications");
      curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json',
                             'Authorization: Basic ' . $oneSignalConfig['app_rest_api_key']));
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
      curl_setopt($ch, CURLOPT_HEADER, FALSE);
      curl_setopt($ch, CURLOPT_POST, TRUE);
      curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($fields));

      $response = curl_exec($ch);
      curl_close($ch);
      return $response;
    }

    return null;
} // EO_Fn

This is the function that made reuse easy for me.

Now, to make it even easier, let me add an example call to that function:

$oneSignalConfig = array(
    'app_id' => 'YOUR_ONE_SIGNAL_APP_ID', // replace with your app_id
    'app_rest_api_key' => 'YOUR_ONE_SIGNAL_REST_API_KEY', // replace with your app_rest_api_key
    'title' => 'Testing the OneSignal Push',
    'brief' => 'Write your brief or summary content here. This will be shown below the title.',
    'url' => 'CONTENT_URL', // URL of the page/post that you're pushing for
    'image_url' => 'CONTENT_IMAGE_URL',
    'logo_url' => 'LOGO_URL', // logo of the company/website
);

// now do the call
$response = osAddPush($oneSignalConfig);

Replace these with your own values, and make sure you test the push more than once before announcing it to the public.
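Since OneSignal replies with JSON, one quick way to sanity-check a test send is to inspect the decoded response. The helper below is my own addition, a minimal sketch assuming the response carries a notification "id" on success and an "errors" array on failure:

```php
<?php
// Hypothetical helper (not part of osAddPush itself): returns true when a
// OneSignal JSON response string looks like a successful send.
function osPushSucceeded($response)
{
    $data = json_decode($response, true);

    // A successful send typically carries a notification "id";
    // a failed one typically carries a non-empty "errors" array.
    return is_array($data)
        && !empty($data['id'])
        && empty($data['errors']);
}
```

You would call it right after `osAddPush()`, e.g. `if (!osPushSucceeded($response)) { /* log and retry */ }`.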

My next post will be a step-by-step guide to integrating OneSignal notifications on any website.

Thanks for reading!

Alexa Website Ranking: A Biased, Fraudulent and Crazy Ranking System

My research and observation over the last couple of years suggest that Alexa is a very biased and dishonest ranking system, and it is getting crazier every day. Yet they keep insisting that their ranking matters in business, which has had a seriously negative impact in many ways in less internet-aware countries like Bangladesh. In this post I'm going to show a few examples that clearly indicate how fraudulent their rankings are, and have been, for websites.

Before going into detail, I'd like to share a personal experience with Alexa ranking from some experiments I ran in 2016. A site of mine that had almost no visitors (it contained only some password-protected backend panels) reached a country rank of 600, overtaking many popular sites, entirely from my own visits using some techniques that Alexa seemed to care about. That experiment was groundwork for being able to write an article like this one.

Alexa basically has two types of rankings in their business logic: sites that pay them handsome money every month, and sites that don't pay a penny. There is arguably a third, hidden category: sites that naturally attract an enormous amount of visitors on their own and don't need to care about Alexa or any ranking at all.

Paid sites get an abrupt positive jump in their ranking (upon payment, of course), no matter what their previous ranking was or how many visitors they really have. Alexa checks no parameters before listing them as top-ranked sites. I found that even domains registered only a few months earlier were placed above high-traffic sites without hesitation or calculation.

The sites I picked for this comparison are below (purely for technical analysis; no offense intended):

Alexa Sites to Compare
(Sites to Compare Alexa Ranking)

A few facts before more detail:

  • The first two are the most popular online news portals in Bangladesh, while the latter two are still unknown.
  • The first two have Google PageRanks of 4 and 5 respectively, while the latter two both have 0 (zero).
  • The first two domains were registered in 2006 and 2004 respectively (more than 10 years old), while the latter two were registered at the end of 2016 (only a few months old).

Now let's see how much content Google, the biggest search engine, has indexed for each of them:

  • The first one has about 1.2 million (1,280K) pages indexed with Google.
First One's Google Indexing
(First One’s Google Indexing)
  • The 2nd one has 819K pages indexed with Google.
2nd One's Google Indexing
(2nd One’s Google Indexing)
  • The 3rd one has only 168 pages indexed with Google, and most of them are broken (they returned 404 during my tests).
3rd One's Google Indexing
(3rd One’s Google Indexing)
  • The 4th one has 3600 pages indexed with Google.
4th One's Google Indexing
(4th One’s Google Indexing)

Now let's see what Alexa rank each of them has at this moment (June 4th, 2017). This is simply shocking:

Example Sites and their Alexa Ranks
(Example Sites and their Alexa Ranks)

Now let's see how abruptly the ranks of the 3rd and 4th sites changed in Alexa's ranking system:

Abrupt change in Alexa ranking, example # 1
(Abrupt change in Alexa ranking, example # 1)


Abrupt change in Alexa ranking, example # 2
(Abrupt change in Alexa ranking, example # 2)

This shows how their rankings climbed up overnight. The red-circled points mark the dates when they were jumped abruptly from one level to another. In other words, these are the dates when their payments took the targeted effect (the green tick marks beside the line "Alexa Traffic Ranks" indicate a paid, and hence verified, site). In any real situation, a change like this is practically impossible.

I have many other points to make in this regard, but I guess this topic has had enough already.

Anyway, the 4 sites used in this analysis are not the only ones of their kind; there are many more where similar discrimination can be found.

The purpose of this technical analysis is not to humiliate, undermine, or publicize these 4 sites in any way, but to let decision makers and business people who still consider Alexa a metric of a site's popularity know that the reality is very different from the number shown as a rank.

If this helps a single person or an event, the purpose of this post will find its meaning.

Thanks very much for reading this.

openssl_sign(): supplied key param cannot be coerced into a private key

Hello, I just wanted to write something after coming back from a long break. Please excuse my absence; life has been very busy with so many other priorities.

This post is for those who are using a PHP-DKIM solution and seeing the error below:

openssl_sign(): supplied key param cannot be coerced into a private key

This happens when you pass a wrong private key input to openssl_sign(). The third parameter is actually the private key resource returned by a call to openssl_get_privatekey(), but it often mistakenly gets passed a string containing the private key lines instead.

So, here is the solution –
1) Prepare the “private key id”:
$fp = fopen("/path/to/file/.htkeyprivate", "r");
$privKey = fread($fp, 8192);
fclose($fp);
$pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');

2) Use in the openssl_sign:
openssl_sign($dataToSign, $signatureVar, $pKeyId);
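To see the correct usage in action without touching DKIM at all, here is a self-contained round trip I'd suggest for sanity checks. It is only a sketch: it assumes the openssl extension is enabled, and it generates a throwaway RSA key instead of loading one from disk as the real setup above does:

```php
<?php
// Generate a throwaway 2048-bit RSA key pair (sketch only; a real DKIM
// setup loads its key from a file as shown earlier).
$keyPair = openssl_pkey_new(array(
    'private_key_bits' => 2048,
    'private_key_type' => OPENSSL_KEYTYPE_RSA,
));

$dataToSign = 'hello dkim';

// Pass the key resource from openssl_pkey_new()/openssl_get_privatekey();
// using the resource avoids the "cannot be coerced" error entirely.
openssl_sign($dataToSign, $signature, $keyPair);

// Verify against the matching public key to prove the signature is valid.
$details  = openssl_pkey_get_details($keyPair);
$verified = openssl_verify($dataToSign, $signature, $details['key']); // 1 = valid
```

If $verified comes back as 1, your key handling is correct and the DKIM signing path will work with the same key resource.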

If you're using a class-based (object-oriented) PHP-DKIM solution, you may put the code below in the __construct() of the main class:

public function __construct()
{
    $fp = fopen("/path/to/file/.htkeyprivate", "r");
    $privKey = fread($fp, 8192);
    fclose($fp);
    $pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');

    $this->open_SSL_priv = $pKeyId;
}

Hope this helps you. I will definitely try to write again whenever I get time. There's a lot to share from my regular development experience, but time keeps me bound.

Happy Blogging!

2012 passed very quickly for me, and now it's almost 2013

2012 really passed very quickly; I still remember the eve of 2012, when I wrote my last blog post here. The year went so fast (or maybe I've been slower) that I failed to write another post since then. A few of my friends, quite a few of my juniors, and a good number of clients expected activity on my blog, but I couldn't manage the time to write anything, although I always planned a lot.

However, the year wasn't as good as 2011. I learned some lessons from my plans and activities that should help me do better in 2013. I got some good opportunities but didn't use them properly. I was sometimes frustrated for many reasons, and then thought through and planned many things that I believe will be helpful in the new year.

The most memorable thing about 2012 is that my daughter Suha started school on November 1st. It was a very colorful moment for Surida and me. Since then I have felt like a more responsible person, and also a prouder father.

In my professional life, I was able to carry my responsibilities at my OneSpout job, plus I did some good medium-sized projects. I contributed to a Denmark-based golf administration system, which is worth mentioning for 2012. Overall, I gained some good experience.

The new year is only 2 hours away, I wish everybody a very happy new year. Hope things will be better in this new year.


Personal summary at the end of 2011 and on the eve of 2012

To my friends, colleagues, and blog visitors: I'm very sorry for not writing anything for a long time. So I felt I must write my only post of 2011; there are no more days left to write another. I just want to update you on my life and work in this post.

2011 has been quite a good year for me personally, but it was not good for the nation, as we didn't see any significant progress in our national life. A lot of negative things happened in our country, and only a little was positive. This year the nation experienced many disappearances, and the state became the reason for its people's insecurity. Still, we made some good progress in information technology, outsourcing, and freelancing. This is probably the only sector where we expect good progress, because it is driven by our talented and hard-working young generation.

Personally, I spent a good year with my family. We moved to a new place that is better than before. My daughter brought a lot of excitement into our life. We spent a year without any severe tension, and we attended some very gorgeous wedding ceremonies.

In my professional life, the year was good except for the last 2 months. Our new office really helped; I brought in some new people and engaged them in my in-house development. We made some good progress in our development, and the results will appear in mid-2012. Our pace was quite good, and I'd want at least the same pace in the coming months.

I tried myself on oDesk and found success quickly. At some point, oDesk identified me as one of their most successful contractors grown in the shortest time (i.e., a fastest-growing contractor) and offered me the role of Country Manager for Bangladesh. My primary goal was to create more successful contractors from Bangladesh by arranging technical events and seminars, which mainly required inspiring young technical people here. I helped create quite a few successful contractors using my own experience. It was an awesome experience in my life.

I spent a really good time developing on the web scraping / data mining track. I was very lucky to get opportunities to contribute to some good projects. Moreover, I received a huge response to my web scraping services.

This year I gained really good experience with daily deals and their aggregation. I worked with content from a variety of daily deal sites, and got a good overall idea about the present and future of daily deals, in terms of both development and business.

With another year coming, I'm hoping to do better... and I wish you all the best.

Happy New Year 2012!

Web Scraping, Screen Scraping, Data Mining & Extraction – Overview of Our Services

We get all kinds of web scraping projects, and we know we are capable of handling any of them. Here is a list of the website scraping, data mining, extraction, reporting, and parsing services we get frequent requests for:

  • E-commerce scraping for price comparison purposes
  • Real estate scraping for building property websites
  • Job scraping for building job archives
  • Business directory scraping for building big address books with addresses, emails, and phone numbers
  • Media scraping in order to build media archives
  • Vehicle list scraping
  • Social network scraping
  • Other scraping jobs like deal data, race information, etc.

I hope this helps our future clients get an idea about our services.

Contact us via email to: dev <at> proscraper [dot] com OR by filling the form on our website


Being a Professional Developer on Website Scraping, Crawling, Data Mining & Extraction, Parsing and Reporting Services

As you know, I have been concentrating on website scraping work for a few months now, and I wanted to share some of my experiences from this journey. What I must say before anything else is that it has been a really enjoyable part of my continuing web development life.

I started web scraping and parsing work in late 2005. I initially developed some contact grabbers to pull contacts from email clients like Gmail, Hotmail, and AOL. Later I had opportunities to develop scraping tools for a few (3 or 4) real estate companies; they used my automated scraping tool to scrape competitors' property listings to compare prices, etc. In late 2006, I started working for Beendo Corporation to scrape popular email clients and integrate all emails into one place. It was a big project that went well beyond just scraping emails: we had to scrape emails, attachments, and contacts, as well as develop processes for sending, replying to, and forwarding emails through the respective clients. We also added facilities to grab emails and contacts from popular social networking sites like Facebook, MySpace, Friendster, Ringo, hi5, etc., and finally we integrated emails from POP3/IMAP servers. After this, we developed a Facebook version of the whole project and released it in May 2008 (see details).

In 2009, I did quite a few web scraping / data mining projects beside my mainstream development: real estate scraping, media scraping, e-commerce scraping for price comparison, fashion product scraping (t-shirts, shoes data, etc.). These all brought significant expertise to my development. Then I stopped working on scraping projects and concentrated on other, more important fields to build skills. But scraping projects were always enjoyable for me, and I kept receiving offers from valuable clients, so it has remained a part of my regular development.

I have developed big scraping tools to scrape product data from popular stores (clients use the data for price comparison) like Google Products, Adorama, Newegg, and many more. A few months back, I developed a job listing site that lists jobs from popular online job sources; behind the site, some scrapers are always working to keep it updated.

Those are my past scraping / data mining experiences. Now I'm again concentrating on web scraping / data mining alongside other development areas, and I've been getting a good number of scraping offers over the past few weeks.

Besides being a professional scraper myself, I am trying to build a web scraping team to handle more projects. I have already started with someone who may need some time to become an expert, but it's going well. Like the e-commerce team, our scraping team will work under its own banner and will be known as "Professional Scraper". I am hoping for a very nice journey; so far, so good.

Please feel free to get in touch to discuss your scraping and data mining projects. Contact us via email: dev <at> proscraper [dot] com


Payment Guidelines – Payoneer and Moneybookers

For my clients: I just wanted to put a blog record for the most-asked question so far (how to pay you?):

1. Guideline to Pay Me Using Payoneer:

Or follow the steps below:

After the payment, make sure Payoneer has shown you a message like the one below:

Thank you for loading money onto a Payoneer Prepaid MasterCard. Your request for $AMOUNT_PAID is being processed. We will let you know when the payment is approved. Once approved, the payment (minus any applicable fees) will be loaded to the requested Payoneer card within 2 business days. Both you and the cardholder will be notified by e-mail.

2. Guideline to Pay Me Using MoneyBookers:

  • Go to
  • If you already have an account, log in; otherwise, register for one and then log in. It is very simple.
  • Once you are logged in, please click the "Send Money" button at the top left
  • Use my email address where it says “Recipient’s email address”:
  • The next steps are self-explanatory

Hope this helps. But if you ask me the same question again, I will just paste this link for your perusal so you understand what to do next. :-)

My Latest Development Concentration

It's been quite a few weeks since I wrote any posts on this blog. I know many of my well-wishers and clients regularly check this blog for updates about me and my work, so yesterday I thought I would share my latest development focus.

Based on recent project traffic, I chose two fields to concentrate on for this year (or until better options appear on my horizon): 1) web scraping and 2) Drupal. Besides these, I have been developing some Facebook applications on request since the last week of last month. As a result, we now expect to receive a big chunk of Facebook projects this month. If that happens, we will probably be engaged in FB development for the next few months; otherwise, I will concentrate on the selected fields, Drupal and web scraping.

The reason for choosing web scraping is the huge demand for it in the freelance market. Most webmasters nowadays want to compete with their rivals smartly, but without spending a lot of time and money. For example, a store owner in Florida or NY cannot just sit relaxed at his desk today; he needs to continuously check who else is selling similar products, at what rates, and whether his own rates are competitive. So what does he need? Either people to monitor all the competitor stores' prices, or an automated way to learn the prices without using people. Web scraping is the solution: good scraping tools can regularly bring updated results to this store owner's computer. This is just one example; there are hundreds of cases where scraping is the only way to manage a business smartly.
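As a toy illustration of the idea (this is not any client's actual scraper; the markup and the "price" class name are invented for the example), extracting prices from fetched HTML can be as simple as a DOMXPath query:

```php
<?php
// Toy price extractor: pulls price strings out of an HTML snippet using
// DOMDocument + DOMXPath. The markup and class names are hypothetical.
function extractPrices($html)
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from imperfect real-world HTML

    $xpath  = new DOMXPath($doc);
    $prices = array();

    foreach ($xpath->query('//span[@class="price"]') as $node) {
        $prices[] = trim($node->textContent);
    }

    return $prices;
}

// In a real monitor, $html would come from curl or file_get_contents()
// on a competitor's product page, run on a schedule.
$html = '<div class="product"><h2>Widget</h2><span class="price">$19.99</span></div>'
      . '<div class="product"><h2>Gadget</h2><span class="price">$24.50</span></div>';

$prices = extractPrices($html);
```

From there, comparing $prices against your own catalog is a plain loop; the hard parts in practice are fetching politely and adapting when the target markup changes.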

I started learning Drupal development in late 2009 to complete 2 projects for one of my most important clients. I had to scrape data from another site, create nodes dynamically, and deploy the data under different categories with SEO-friendly URLs, plus handle some management tasks in the backend. Doing all this, I fell in love with Drupal. I did some more projects at the beginning of this year and started working with it professionally. Now it is giving me good results. My latest Drupal development was for a big real estate site, automating the import and update of their properties from files supplied by agent companies.

That’s all my recent development concentration. Thanks for reading.

JobLance.inFo – An Effort to Help Out Job Seekers to Find Latest Freelance Jobs Online

We recently delivered the initial version of a project that helps job seekers and freelancers find the latest freelance jobs from major, popular job sources in one place. Besides displaying jobs in many ways on the website, we have latest-job feeds, popular-job feeds, source-wise feeds, category-wise feeds, and even feeds for a search, all to make job searching easier for job seekers. You might be interested in checking out the site here:

Currently we grab the latest jobs from 7 sources: GAF, oDesk, ScriptLance, GAC, EUFreelance, PHPClasses, and LinkedIn. We are working to integrate more job sources, especially ones that list long-term jobs as well as short-term freelance jobs and projects.

JobLance also provides an option to show all jobs on Facebook. We developed a Facebook application for them that displays the latest 100 jobs. See it here: The application is available to add as a widget on your profile (or pages). Here is an example:

Some other links about JobLance.inFo:

  • JobLance Blog
  • Latest Job Feeds
  • Popular Job Feeds
  • JobLance Fan Page on Facebook
  • JobLance on Twitter
  • JobLance PHP Jobs on Twitter
  • JobLance Joomla Jobs on Twitter
  • JobLance SEO Jobs on Twitter