General information for the business: Working on a research project.

Description of requirements/functionality: I have a requirement to collect a large number of food images and/or recipes. I am looking for an experienced Python developer with skills in the Scrapy package. Microsoft Azure experience would be nice, but it's easy to pick up.

The job will involve:
a) Setting up a Scrapy pipeline
b) Writing custom spiders for root domains
c) Testing the spiders and configuring them so they don't die off or get banned by site admins
d) Deploying them to cloud servers (or to serverless functions) to run
e) Downloading the resources to a cloud storage account

The technical skills I am looking for are:
- Python 3.x
- Scrapy library
- Microsoft Azure cloud services
- Comfortable working with Git

Specific technologies required: Python 3, Scrapy
OS requirements: Linux

Extra notes: Please note that as a fellow developer, readable and properly structured code is very important to me. If you are in the habit of naming your methods def call_api() or your variables int x6, please do not bother applying. Other than that, everything is cool.