Build a web scraper for a specific website (Perl or Ruby)
$30-250 USD
Paid on delivery
I need a web scraper that gathers the list of events posted on a single specific website. The scraper will convert events found on the site's online calendar into files containing the details of those events.
At this URL you will find a "calendar" listing upcoming events:
[login to view URL]
The scraper must be written in Perl or Ruby. The program should take two command line arguments.
* The first argument should be an integer number of days into the future to scan. For example, if this argument were zero, only today's events should be scanned; if it were 2, all events occurring today, tomorrow, and the day after should be scanned.
* The second command line argument should be the name of a local directory where output files are placed.
Each "event" found on the calendar should cause the creation of one output file. The output file name should consist of the date on which the event "occurs" (in the format YYYY-MM-DD) as well as any additional characters to make the output filename unique and the file suffix should be ".yaml". An example of a valid filename might be "[login to view URL]" to indicate the second event occurring on February 17, 2019. The output file format should be YAML. An example of a valid output file is attached.
Each YAML file is built up from scraping the event detail page such as this one: [login to view URL]
In the attached example, the scraped data elements are circled in red. Please note that the detail page shown in the example actually has multiple events on it and should generate multiple output YAML files (one for each date/time).
For example, suppose that the script you create is called [login to view URL] and you invoke it with this command line on a Linux server:
./[login to view URL] 2 /tmp/scrapefiles
It would generate perhaps six YAML files in the /tmp/scrapefiles directory.
Project ID: #18803855
About the project
Awarded to:
I'm one of the best Perl web scraping experts here, which is why I can provide you a working script in less than a day for just $100. It will read two command line params and output YAML files just like you want. Yo… [More]
17 freelancers are bidding an average of $161 for this job
Hi there, I am a web scraping expert from Bosnia & Herzegovina, Europe. I have carefully gone through your requirements and I would like to help you with this project! I can start immediately and finish it within th… [More]
Hi, the project description is clear to me. I'm ready to produce a Perl script for you.
Hello, I will create the web scraper in Ruby. Please send me a message so we can discuss more. I have 8+ years of experience and have done similar jobs for many web applications. Thanks!
Hi there! I see you are looking for a Perl expert who can build a web scraper for you. Here I am! I can offer you 10+ years of working experience and a wide range of projects completed successfully. Here are so… [More]
Hello, I have gone through the JD. I can work on the scraper; I have done similar jobs before and have 9+ years of experience in RoR. Please send me a message so we can discuss further. Thank you!
Hi, I have gone through your requirement to scrape websites. I am an EXPERT in building scraping tools/scripts, so I can SURELY work on your project. I have 4 YEARS of EXPERIENCE in developing PHP-PYTHON… [More]
Hi, I am a Ruby developer and DevOps engineer. Skills:
- Ruby
- System Admin
- Docker, Virtualbox, Nanobox, Kubernetes
- Hosting & maintaining any platform
- Git, Bitbucket
- MySQL, PostgreSQL
- Web Scrap… [More]
Hello, after reading your project details I believe I'm suitable for this project, as I'm an expert in it with more than 7 years of experience. Please feel free to contact me. I am looking forward to hearing from you. [More]
Hi, I'm Colin, a Ruby developer. I have worked on many scraper projects written in Ruby with the support of gems like Nokogiri, headless Chromium/Firefox, PhantomJS, or plain HTTP. I hope we can work together in… [More]
Hi, I can achieve this using Perl. I have 7 years of experience in Perl. I would also like to know the platform you are using to run the script. Let me know your preferences. Thanks,
Hi, I am experienced in web scraping using Ruby, using the Nokogiri gem to scrape static HTML pages and Watir for JavaScript-based web pages (I also use Watir for browser automation tasks). If you are interested please co… [More]
Certifications & Achievements
• Certified ScrumMaster®
• Certified PRINCE2® Project Manager
• Ex-Startup Founder with reasonable Exit, Product/Project Management Experience
• Agile Coach for cultivating Agile Cult… [More]
Hello, I can do what you are looking for. I use Ruby for web scraping (the Watir and/or Mechanize gems).