openssl_sign(): supplied key param cannot be coerced into a private key

September 19, 2015 § Leave a comment

Hello, just wanted to write something after coming back from a long break. Please excuse my absence; life has been very busy with many other priorities.

This post is for those who are using a PHP-DKIM solution and encountering the error below:

openssl_sign(): supplied key param cannot be coerced into a private key

This happens when the wrong value is passed as the private key to openssl_sign(). The third parameter should be the “private key id” returned by a call to openssl_get_privatekey(), but it often mistakenly gets passed the raw string contents of the private key file instead.

So, here is the solution –
1) Prepare the “private key id”:
$fp = fopen("/path/to/file/.htkeyprivate", "r");
$privKey = fread($fp, 8192);
fclose($fp);
$pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');

2) Use it in openssl_sign():
openssl_sign($dataToSign, $signatureVar, $pKeyId);

If you’re using a class-based (object-oriented) PHP-DKIM solution, you may put the code below in the __construct() of the main class:

public function __construct()
{
    $fp = fopen("/path/to/file/.htkeyprivate", "r");
    $privKey = fread($fp, 8192);
    fclose($fp);
    $pKeyId = openssl_get_privatekey($privKey, 'optional_passphrase');

    $this->open_SSL_priv = $pKeyId;
}
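For completeness, here is a self-contained sketch of the whole flow. Note that it generates a throwaway RSA key pair with openssl_pkey_new() purely for demonstration – in a real DKIM setup you would load your existing key from the file as shown above:

```php
<?php
// Generate a throwaway RSA key pair for demonstration only.
$res = openssl_pkey_new(array(
    'private_key_bits' => 2048,
    'private_key_type' => OPENSSL_KEYTYPE_RSA,
));
openssl_pkey_export($res, $privKeyPem);           // PEM string of the private key
$pubKeyPem = openssl_pkey_get_details($res)['key']; // matching public key

// Turn the PEM string into a key handle -- this is the value
// openssl_sign() expects as its third parameter.
$pKeyId = openssl_get_privatekey($privKeyPem);
if ($pKeyId === false) {
    die('Could not load private key: ' . openssl_error_string());
}

$dataToSign = 'hello dkim';
if (!openssl_sign($dataToSign, $signature, $pKeyId)) {
    die('Signing failed: ' . openssl_error_string());
}

// Verify with the matching public key; 1 means the signature is valid.
var_dump(openssl_verify($dataToSign, $signature, $pubKeyPem));
```

On PHP 8 and later these functions return OpenSSLAsymmetricKey objects rather than resources, but the calling pattern is the same.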

Hope this helps you. I will definitely try to write again whenever I get time; there’s a lot to share from day-to-day development experience, but time is scarce.

Happy Blogging!


Personal summary at the end of 2011 and on the eve of 2012

December 31, 2011 § Leave a comment

To my friends, colleagues & blog visitors – I’m very sorry for not writing anything for a long time. So I felt I must write the only post of 2011; there are no more days left to write another. I just wanted to keep you updated on my life & work in this post.

2011 has been quite a good year for me personally, but it was not good for the nation, as we didn’t see any significant progress in our national life. A lot of negative things happened in our country, and only a little was positive. This year the nation experienced many disappearances, and the state itself became a source of insecurity for its people. Still, we made some good progress in information technology, outsourcing & freelancing. This is probably the only sector where we can expect good progress, because it is driven by our talented & hard-working young generation.

Personally, I spent a good year with my family. We moved to a new place which is better than the old one. My daughter brought much excitement into our life. We spent the year without any severe tension, and we attended some very gorgeous wedding ceremonies.

In my professional life, the year was good except for the last 2 months. Our new office really helped; I brought in some new people and engaged in my in-house development. We made some good progress in our development – results will appear in mid-2012. Our pace was quite good, and I’d want at least the same pace in the coming months.

I tried myself on oDesk and found success quickly. At some point, oDesk recognized me as one of their most successful contractors in the shortest time (i.e. a fastest-growing contractor), then offered me the role of their Country Manager for Bangladesh. My primary goal was to create more successful contractors from Bangladesh by arranging technical events & seminars, which mainly meant inspiring young technical people here. I helped create quite a few successful contractors using my own experience. It was an awesome experience in my life.

I spent a really good time with my development on the web scraping / data mining track. I was very lucky to get the opportunity to contribute to good projects like onespout.com and ave23.com. Moreover, I received a huge response to my web scraping services.

This year I gained really good experience with daily deals and their aggregation. I worked on a variety of daily-deal site contents and got an overall good idea about the present and future of daily deals – in terms of both development and business.

Another year is coming, and I’m hoping to do better… I wish you all the best.

Happy New Year 2012!

Web Scraping, Screen Scraping, Data Mining & Extraction – Overview of Our Services

July 17, 2010 § 3 Comments

We are getting all kinds of web scraping projects, and we are capable of handling any of them. Here is a list of the website scraping, data mining, extraction, reporting and parsing services we get frequent requests for:

  • E-commerce scraping for price comparison purposes
  • Real estate scraping for building property websites
  • Job scraping for building job archives
  • Business directory scraping for building a big address book with addresses, emails and phone numbers
  • Media scraping in order to build media archives
  • Vehicle list scraping
  • Social network scraping
  • Other scraping jobs like deal data, race information, etc.

I hope this helps our future clients get an idea of our services.

Contact us via email to: dev <at> proscraper [dot] com OR by filling the form on our website http://www.proscraper.com/

Thanks.

Being a Professional Developer on Website Scraping, Crawling, Data Mining & Extraction, Parsing and Reporting Services

July 17, 2010 § 2 Comments

As you know, I have been concentrating on website scraping work for a few months now, and I wanted to share some of my experiences in this journey. What I must say before anything else is that it has been a really enjoyable field in my continuing web development life.

I started web scraping & parsing work in late 2005. I initially developed some contact grabbers to pull contacts from email clients like Gmail, Hotmail and AOL. Later I had opportunities to develop scraping tools for a few (3 or 4) real estate companies – they used my automated scraping tool to scrape competitors’ properties and compare prices, etc. In late 2006, I started working for Beendo Corporation to scrape popular email clients and integrate all emails into one place. It was a big project that wasn’t just about scraping emails: we had to scrape emails, attachments and contacts, as well as develop processes for sending/replying/forwarding emails through the respective clients. We also added facilities to grab emails & contacts from popular social networking sites like Facebook, MySpace, Friendster, Ringo, hi5, etc., and finally we integrated emails from POP3/IMAP servers. After this, we developed a Facebook version of the whole project and released it in May 2008 (see details).

In 2009, I did quite a few web scraping / data mining projects beside my mainstream development – real estate scraping, media scraping, e-commerce scraping for price comparison, fashion product scraping (t-shirts, shoes data, etc.) – all of which brought significant expertise to my development. Then I stopped working on scraping projects and concentrated on other, more important fields to build new skills. But scraping projects were always enjoyable for me, and I kept receiving offers from valuable clients. Thus it has remained a part of my regular development.

I have developed huge scraping tools to scrape product data from popular stores (clients use the data for price comparison) like Google Products, buy.com, amazon.com, Adorama, Newegg, and many more. A few months back, I developed a job listing site, http://joblance.info, which lists jobs from popular online job sources – behind the site, scrapers are always working to keep it updated.

These are all my past scraping / data mining experiences. Now I’m again concentrating on web scraping / data mining beside other development areas, and I have been getting a good number of scraping offers over the past few weeks.

Besides being a professional scraper myself, I am trying to build a web scraping team to handle more projects. I have already started with someone who may need some time to become an expert, but things are going well. Like the e-commerce team, our scraping team will work under its own banner at http://www.proscraper.com/ and will be known as “Professional Scraper”. I am hoping for a very nice journey – so far so good.

Please feel free to communicate and discuss about your scraping & data mining projects. Contact us via email: dev <at> proscraper [dot] com

Thanks.

My Latest Development Concentration

June 11, 2010 § 6 Comments

It’s been quite a few weeks since I wrote a post on this blog. I know many of my well-wishers and clients regularly check this blog for updates about me and my work. So yesterday I thought I would share my latest development concentration.

Based on recent project traffic, I chose two fields to concentrate on this year (or until better options appear on my horizon): 1) web scraping and 2) Drupal. Besides these, I have been developing some Facebook applications on request since the last week of last month. As a result, we are expecting a big chunk of Facebook projects this month. If that happens, we will probably be engaged in FB development for the next few months. Otherwise, I will concentrate on the selected fields – Drupal and web scraping.

The reason for choosing web scraping is the huge demand in the freelance market. Most webmasters nowadays want to compete smartly without spending a lot of time and money. For example, a store owner in Florida or NY cannot just sit relaxed at his desk today; he needs to continuously check who else is selling similar products and at what rates – is my rate competitive in the market? So what does he need? Either people to monitor all competitor stores’ prices, or an automated way to track prices without them. Web scraping is the solution here: good scraping tools can regularly bring updated results to this store owner’s computer. This is just one example – there are hundreds of cases where scraping is the only way to manage a business smartly.
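As a tiny illustration of the extraction step, here is a hedged PHP sketch that pulls product names and prices out of some hypothetical competitor-page markup (the class names are made up) using DOMDocument and XPath. A real scraper would first fetch the live page (e.g. with cURL) and run on a schedule:

```php
<?php
// Hypothetical snippet of a competitor's product listing page.
$html = <<<HTML
<div class="product"><span class="name">Widget A</span>
<span class="price">$19.99</span></div>
<div class="product"><span class="name">Widget B</span>
<span class="price">$24.50</span></div>
HTML;

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings caused by the bare fragment
$xpath = new DOMXPath($doc);

// Collect name => price pairs for comparison against our own catalog.
$prices = array();
foreach ($xpath->query('//div[@class="product"]') as $product) {
    $name  = $xpath->query('.//span[@class="name"]', $product)->item(0)->textContent;
    $price = $xpath->query('.//span[@class="price"]', $product)->item(0)->textContent;
    $prices[$name] = (float) ltrim($price, '$');
}
print_r($prices);
```

Real pages are messier, of course – pagination, sessions, rate limits, markup changes – which is exactly why this kind of work keeps a developer busy.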

I started learning Drupal development in late 2009 to complete 2 projects for one of my most important clients. I had to scrape data from another site, create nodes dynamically, and then deploy the data under different categories with SEO-friendly URLs, plus some management tasks in the backend. Doing all this, I fell in love with Drupal. I did some more projects at the beginning of this year and started working with it professionally. Now it is giving me good results. My latest Drupal development was for a big real estate site, automating the import and updating of their properties from files supplied by agent companies.

That’s all my recent development concentration. Thanks for reading.

JobLance.inFo – An Effort to Help Out Job Seekers to Find Latest Freelance Jobs Online

April 2, 2010 § 1 Comment

We recently delivered the initial version of a project that helps job seekers and freelancers find the latest freelance jobs from major, popular job sources in one place. Besides displaying jobs in many ways on the website, we have latest-job feeds, popular-job feeds, source-wise feeds, category-wise feeds, and even feeds for a search, to make job hunting easier for job seekers. You might be interested in checking out the site here: http://joblance.info.

Currently we are grabbing the latest jobs from 7 sources – GAF, oDesk, ScriptLance, GAC, EUFreelance, PHPClasses, LinkedIn. We are working to integrate more job sources, especially ones that list long-term jobs as well as short-term freelance jobs & projects.

JobLance also provides an option to show all jobs on Facebook. We developed a Facebook application for them which displays the latest 100 jobs. See it here: http://apps.facebook.com/jobsonline. The application can also be added as a widget to your profile (or pages). Here is an example: http://www.facebook.com/joblance?v=app_112592635421981

Some other links about JobLance.inFo –
http://joblance.info/blog – JobLance Blog
http://joblance.info/rss/latest.xml.php – Latest Job Feeds
http://joblance.info/rss/popular.xml.php – Popular Job Feeds
http://www.facebook.com/joblance – JobLance Fan Page on Facebook
http://twitter.com/joblance_jobs – JobLance on Twitter
http://twitter.com/allphpjobs – JobLance PHP Jobs on Twitter
http://twitter.com/joblance_joomla – JobLance Joomla Jobs on Twitter
http://twitter.com/joblance_seo – JobLance SEO Jobs on Twitter

Thanks.

Create Short URL using our URL Shrinker at RUPOM.NET

December 13, 2009 § 3 Comments

Today we have released a free URL shrinking service at http://rupom.net which creates the shortest possible URL from your long URL input. As internet usage grows and people connect with each other through social networks and social media, as well as the usual options like email, personal websites and blogs, the need for a good, short URL has become very important.

For example: I recently became a proud father and wrote a blog post about it, available at https://rupom.wordpress.com/2009/10/07/i-have-been-a-proud-father-found-meaning-of-life/. I wanted to share this link with friends & family members, but it is too long for anyone to remember. So I went to http://rupom.net/ and created the short URL http://rupom.net/fatherhood (I entered “fatherhood” in the keyword field), which is much easier to remember. Clicking that link takes you to the correct destination URL. This kind of short URL is useful in many more places, especially on http://twitter.com, where posts are limited to 140 characters.

RUPOM.NET creates the shortest possible URL for you, and it never expires. If you want to use this service from within your own website, you can easily do so using our API documentation or our HTML form code.

To protect against spammers, we have set a 15-second gap between two consecutive requests from the same IP address. If you still need more facilities, please feel free to contact us. Facilities to track your stats can also be provided upon request.

Please start using our URL shrinking service today and send us your comments/feedback.

Thanks.

Where Am I?

You are currently browsing the Web Development category at Rupom Here.