Elektra-Auto-Checkout - Utilities to assist in checkout automation of various commercial and retail sites

Elektra

About This Project

Elektra is designed to automate the process of inventory checking, purchasing items, and generating account login sessions for various commercial and retail sites.

Amazon is the first of many sites to come. Expect weekly additions, though some sites may only receive login and monitor modules rather than full-fledged checkout.

Note

This project is not intended for resellers. It exists for educational purposes and experimentation, and to help others obtain an item they may need.

Progress / Roadmap

Site Login Monitor Checkout
amazon.com
bestbuy.com
newegg.com
evga.com
target.com
walmart.com
  • Add notifications module (Discord, Slack, Twilio)
  • Add auth code fetcher (imap + Gmail)
  • Add account generators

Installation

Requires Go 1.7

go get github.com/ffeathers/Elektra-Auto-Checkout

Run go mod tidy if issues arise with any of Elektra's imported modules.

Getting Started

Below is some example usage of the Amazon module. You can find additional examples for other sites in the examples folder.

Checking stock

If UserAgent is left empty, a user-agent is generated for you automatically. PollingInterval is the delay in seconds that a monitor sleeps after each stock check. Once started, a monitor task runs indefinitely until stock is detected.

monitorData := elektra.AmazonMonitorData{
  UserAgent:       "",
  UseProxies:      true,
  Proxies:         []string{"IP:Port", "IP:Port"},
  PollingInterval: 3,
  Sku:             "ASIN",
  OfferId:         "OfferId",
}

monitor.AmazonMonitorTask(&monitorData)

log.Printf("SKU %s: In Stock", monitorData.Sku)
Starting a checkout instance

Account cookies are needed in order to complete a checkout. You can use cookies from your browser, or you can create a session using the Amazon login module (not yet implemented). RetryDelay is the number of seconds a checkout task sleeps after an error in the checkout flow before restarting. MaxRetries is the maximum number of checkout attempts a task will make before it returns. If the return value is false, the task was unable to complete a successful checkout in any of its attempts. If it is true, the checkout was successful and OrderNum is now populated with the order number.

If you would like to use your local IP, set UseProxies to false. Currently only IP-authenticated (IP:Port) proxies are supported, but support for user/pass-authenticated proxies will be added soon.

checkoutData := elektra.AmazonCheckoutData{
  UserAgent:  "",
  UseProxies: true,
  Proxies:    []string{"IP:Port", "IP:Port"},
  Cookies:    "",
  MaxRetries: 5,
  RetryDelay: 3,
  Sku:        "ASIN",
  OfferId:    "OfferId",
}

orderSuccess := checkout.AmazonCheckoutTask(&checkoutData)
if orderSuccess {
  log.Println("Checkout successful | order number: " + checkoutData.OrderNum)
}

License

MIT

Owner
Proficient in Python. Currently learning Go and JS (Node). I hope to open-source some cool projects in these languages as I continue to learn them.
Similar Resources

YouTube tutorial about web scraping using Golang and Gocolly

This is an example project I wrote for a YouTube tutorial about web scraping using Golang and Gocolly. It extracts data from a tracking differences webs

Mar 26, 2022

Go program that fetches URLs concurrently and handles timeouts

fetchalltimeout This is an exercise of the book The Go Programming Language, by

Dec 18, 2021

Go program that fetches urls and prepends http:// if missing

fetchautoprefix This is an exercise of the book The Go Programming Language, by

Dec 18, 2021

Fast golang web crawler for gathering URLs and JavaScript file locations.

Fast golang web crawler for gathering URLs and JavaScript file locations. This is basically a simple implementation of the awesome Gocolly library.

Sep 24, 2022

Extract structured data from web sites. Web sites scraping.

Dataflow kit Dataflow kit ("DFK") is a Web Scraping framework for Gophers. It extracts data from web pages, following the specified CSS Selectors. You

Jan 7, 2023

List-Utils - 🔧 Utilities for maintaining the list of repost sites

SMR List Utils This is a Go CLI tool that helps with managing the StopModReposts blacklist. Install Use GitHub Releases and download binary. Linux Qui

Jan 3, 2022

CDN for Open Source, Non-commercial CDN management

CDN Control Official Website: https://cluckcdn.buzz Documentation (Traditional Chinese): https://cluckcdn.buzz/docs/ 简体中文 README: README_CN.md Please

Dec 20, 2021

GoLang utility packages to assist with the development of web micro-services.

GoTil Golang utility packages to assist with the development of web micro-services. Installation As a library. go get github.com/ccthomas/gotil Usage

Nov 26, 2021

Small golang app to assist in managing extended hdwallet keys

Installation go install github.com/provenance-io/extkey/cmd/extkey@latest Encoding Key generation interactive # Using interactive mode ▷▷ extkey Mne

Aug 15, 2022

Automation Tool to auto generate markdown notes from online classes/talks/presentations.

autonotes Automation tool to autocapture screenshots and join them with a supplied .srt or .txt file and output a notes file in markdown. Problem? Wat

Aug 29, 2021

Git-auto-push - Auto commit and push to github repositories

Auto commit and push to github repositories. How to use git clone https://github

Dec 19, 2022

Terraform provider to help with various AWS automation tasks (mostly all that stuff we cannot accomplish with the official AWS terraform provider)

terraform-provider-awsutils Terraform provider for performing various tasks that cannot be performed with the official AWS Terraform Provider from Has

Dec 8, 2022

This package includes various utilities and extensions for your Go code.

Go utilities This package includes various utilities and extensions for your Go code. Inspired by lodash Install go get github.com/murat/go-utils@mast

May 11, 2022

A benchmarking shootout of various db/SQL utilities for Go

golang-db-sql-benchmark A collection of benchmarks for popular Go database/SQL utilities Libraries under test database/sql + go-sql-driver/mysql gocra

Dec 19, 2022

The Cloud Posse Terraform Provider for various utilities (E.g. deep merging)

terraform-provider-utils Terraform provider to add additional missing functionality to Terraform This project is part of our comprehensive "SweetOps"

Jan 7, 2023

This repository provides various utilities to help you build your NFT collection!

Attention! A powerful computer may be required! About This repository provides various utilities to help you build your NFT collection: Generate image

Nov 4, 2022

Automating Kubernetes Rollouts with Argo and Prometheus. Checkout the demo URL below

observe-argo-rollout Demo for Automating and Monitoring Kubernetes Rollouts with Argo and Prometheus Performing Demo The demo can be found on Katacoda

Nov 16, 2022
Comments
  • Amazon pricing data doesn't seem to be working

    Hey,

    For me I am now unable to get pricing data from the amazon data api you also use. Requesting the product price data using the accept header

    application/vnd.com.amazon.api+json; type="collection(product/v2)/v1"; expand="buyingOptions[].price(product.price/v1)"
    

    Results in an error when trying to expand the pricing options with a message saying "You do not have authorization for the requested resource."

    {
       "resource":{
          "types":[
             "collection(product/v1)/v1",
             "collection(product/v2)/v1",
             "collection(product.mini/v1)/v1"
          ],
          "url":"/api/marketplaces/ATVPDKIKX0DER/products/B09BDG5RP3"
       },
       "type":"collection(product/v2)/v1",
       "entities":[
          {
             "resource":{
                "url":"/api/marketplaces/ATVPDKIKX0DER/products/B09BDG5RP3",
                "types":[
                   "product/v2",
                   "product.offer-comparison-experience/v1",
                   "product.mini/v1",
                   "product/v1"
                ]
             },
             "type":"product/v2",
             "entity":{
                "asin":"B09BDG5RP3",
                "buyingOptions":[
                   {
                      "price":{
                         "resource":{
                            "url":"/api/marketplaces/ATVPDKIKX0DER/products/B09BDG5RP3/buying-options/0yjXUM2kMWcCfOqbDAGE3qI5hQwZh-jXK4f1IBZ_HYQ%3D/price",
                            "types":[
                               "product.price/v1"
                            ]
                         },
                         "type":"error/v1",
                         "metadata":{
                            "x-amzn-metrics-id":"INTER_RANKED:c0ae1917-6e96-3909-89ae-d48973e76a97",
                            "x-amzn-metrics-id":"MBOS:0bf8dea4-1030-3997-9bc9-73431c69ae7d",
                            "x-api-exp-data":"GLOBAL_STORE_FORT_PLUGIN_OFFER_SUPPRESSION_468651:T1,PRR_UPSELL_BOAPI_INTEGRATION_GSOD_494140:T1,BUYING_OPTION_GLOBAL_STORE_GUARDRAIL_452747:T1,ADBL_HS_MINERVA_LOOM_BUYING_OPTIONS_258906:T1,OFFERSX_BOPIS_CX_ROAR_500067:T1,GLOBAL_STORE_SKIP_FMA_IFD_OVERRIDE_508576:T1,MOD_PRIME_GLOBAL_STORE_SUPPRESSION_481007:T1"
                         },
                         "entity":{
                            "code":"403.1",
                            "details":{
                               "message":"You do not have authorization for the requested resource.",
                               "url":"https://api.amazon.com/shop"
                            },
                            "encryptedInternalInfo":"Z/XkAL9bIxm133wf7FKPpWevX9iWYZAvVGyOTboHMXCYRsNigeslowrGMeJU8+0yuAOwjG5PBDgecAVDLz5H4KQca3/n+V6/xPpf4+1NnSBaYalokWkt1SAGboXkG20r2fPzNPNFqmyD1+TN/aQFRgH2Btd95aaqJeEGGZmcPFereoTPfGGtbpWQV6fF18uaB0MWDwPzFkI6RmMbSkBKeeyj3vbC3JtSj+WR6hJuWG5H3aPjm14KOw00AzyVX9nYDDqrFf5fyjA8so3ghFSJbfEmGHXeyutaCO3XiU45GxI1xjN/SjZFuWgIGVsQwJpsu1ZVoyjWCoB0KIuASz3F2bTvVbmsDR5ysegDG1o8Cks9EFpMWcn1eYbLR1X3lh/Wx/M5M4lvx06N75hyOx4ABrSY5NTL474kolxPI0eNZwZjrZr3Jy3GMmP5SzQYnLUWPdyKvEIDDL+H++/R+yJ0ICF8hiPMc+SbqjoVY1mkzx6eZYshS14NL0KUoBkrD/4yqyUGWLv8jJ0FCdtxj7Lu5xWjCo1yDVxMig9Qa4vlF2yJYUfa5J6InAVtWZAAP5xhRE0crLYZ63Q3n1y0hdTQQ0TB64xjo9rwCdC/e+mB/2M4gXAkkLYWyMUkm8OKJadj",
                            "status":403
                         }
                      },
                      "type":"NEW"
                   }
                ]
             }
          }
       ]
    }
    

    Is this the same for you?

Go-Yahoo-Finance-Daily-Actives - Scrape for the daily actives on yh Finance and save the data to a CSV, and optionally send it to yourself as an email

Dec 13, 2022
Interact with Chromium-based browsers' debug port to view open tabs, installed extensions, and cookies

WhiteChocolateMacademiaNut Description Interacts with Chromium-based browsers' debug port to view open tabs, installed extensions, and cookies. Tested

Nov 2, 2022
Distributed web crawler admin platform for spiders management regardless of languages and frameworks. 分布式爬虫管理平台,支持任何语言和框架

Crawlab 中文 | English Installation | Run | Screenshot | Architecture | Integration | Compare | Community & Sponsorship | CHANGELOG | Disclaimer Golang-

Jan 7, 2023
Elegant Scraper and Crawler Framework for Golang

Colly Lightning Fast and Elegant Scraping Framework for Gophers Colly provides a clean interface to write any kind of crawler/scraper/spider. With Col

Jan 9, 2023
[Crawler framework (golang)] An awesome Go concurrent Crawler(spider) framework. The crawler is flexible and modular. It can be expanded to an individualized crawler easily or you can use the default crawl components only.

go_spider A crawler of vertical communities achieved by GOLANG. Latest stable Release: Version 1.2 (Sep 23, 2014). QQ群号:337344607 Features Concurrent

Jan 6, 2023
Apollo 💎 A Unix-style personal search engine and web crawler for your digital footprint.

Apollo 💎 A Unix-style personal search engine and web crawler for your digital footprint Demo apollodemo.mp4 Contents Background Thesis Design Archite

Dec 27, 2022
DataHen Till is a standalone tool that instantly makes your existing web scraper scalable, maintainable, and more unblockable, with minimal code changes on your scraper.

Dec 14, 2022
Download Vimeo videos and retrieve metadata in Go.

vimego Download Vimeo videos and retrieve metadata. Largely based on yashrathi's vimeo_downloader. Installing go get github.com/raitonoberu/vimego Ple

Dec 30, 2022
Crawls web pages and prints any link it can find.

crawley Crawls web pages and prints any link it can find. Scan depth (by default - 0) can be configured. features fast SAX-parser (powered by golang.o

Jan 4, 2023
skweez spiders web pages and extracts words for wordlist generation.

skweez skweez (pronounced like "squeeze") spiders web pages and extracts words for wordlist generation. It is basically an attempt to make a more oper

Nov 27, 2022