Senior Software Engineer
Job description
About Us
We are one of the region's fastest-growing technology solutions providers. We focus on delivering cloud-based technology solutions for clients across all disciplines. We are dedicated to fostering an innovative, collaborative culture that constantly pushes beyond the status quo and motivates teammates to turn innovative ideas into industry-changing technology capabilities. Our firm aims to create products and solutions that identify, capture, and execute on alpha-generating strategies in the securities and alternative marketplaces.
With over 300 percent year-over-year growth for five consecutive years, our client-focused approach of "listen first, recommend second" allows our core commitment to prevail even when facing the most complex issues across multi-disciplinary sectors and industries.
POSITION OVERVIEW
We are looking for an experienced Python Developer to join our team. The successful candidate will be responsible for developing and maintaining software applications written in Python, as well as providing technical support and troubleshooting. The ideal candidate should have a strong understanding of the Python programming language, web development frameworks, databases, and software development tools, along with excellent communication skills and a passion for problem-solving.
This position is meant for a strong Python developer who can double as a data engineer: someone able to pull data from varying sources, experienced in normalizing and scrubbing data, and ultimately able to load that data into a database. The candidate must be extremely detail-oriented and willing to contribute to a small team with high visibility and an expectation to deliver and make a difference from day one. The candidate must be able to constructively suggest, recommend, and independently implement advancements to our technological infrastructure.
The candidate must have expert experience leading and implementing data collection techniques (web crawling and web scraping using tools like Beautiful Soup, Scrapy, and other techniques).
Required Skills
- Debug and troubleshoot existing Python code
- Enhance existing code by adding new features and data sources
- Migrate a spreadsheet environment into a database
- Migrate Python exports from Excel to the newly created database
- Create a new database and corresponding tables based on data collected via Beautiful Soup and Scrapy
- Create and schedule SQL jobs that run against the database, matching specific parameters and rules, to deliver daily reports to downstream partner systems
- Enhance the existing UI by implementing reporting features that let front-end users call, view, and interact with data from the database
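The scrape-and-load workflow described above can be sketched in a few lines. This is a minimal illustration only: the HTML snippet, table name, and columns are hypothetical, and SQLite stands in for the Postgres/MySQL target mentioned elsewhere in this posting.

```python
import sqlite3
from bs4 import BeautifulSoup

# Hypothetical HTML standing in for a scraped page.
html = """
<table id="prices">
  <tr><td>AAPL</td><td>191.20</td></tr>
  <tr><td>MSFT</td><td>415.50</td></tr>
</table>
"""

# Parse each table row into a (symbol, price) tuple.
soup = BeautifulSoup(html, "html.parser")
rows = []
for tr in soup.select("#prices tr"):
    cells = tr.find_all("td")
    if cells:
        rows.append((cells[0].get_text(), float(cells[1].get_text())))

# Load the scraped rows into a database table (SQLite used here
# purely as a self-contained stand-in for Postgres).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (symbol TEXT, price REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM prices").fetchone()[0])  # 2
```

In practice the HTML would come from `requests` or Selenium rather than a literal string, and inserts would go through a Postgres driver such as psycopg2.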
Data Gathering
- Strong experience with data collection, gathering, and ingestion techniques
- Strong use of Python web data collection tools and libraries such as Selenium, Scrapy, and Beautiful Soup when APIs or datasets are not available
- Understanding of HTML/CSS, JavaScript, and HTTP methods (for understanding page structure when web scraping)
- Prepared to fetch data via SFTP, FTP, Wget, cURL, REST APIs, and GraphQL queries from varying internet websites
- Experience working with pandas and comfortable parsing and synthesizing XML and/or JSON documents
- Docker (setting up Kubernetes-style processing if warranted for data scraping, ingestion, and normalization)
- Knowledge of Puppeteer or other automatable web client technologies
- Experienced in creating, using, and implementing APIs, connecting to multiple sources, and able to normalize data to create a database where none exists
- Python code that captures all data, including via data provider APIs (Selenium / pandas / Scrapy / Beautiful Soup)
- Data science / web automation background: Python, Jupyter notebooks / PyCharm, Selenium / Beautiful Soup / Requests
- The database must be updated at varying frequencies with data from multiple source systems
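Several bullets above call for parsing and synthesizing JSON with pandas. A common pattern is flattening a nested API payload into a table with `pandas.json_normalize`; the payload, field names, and derived mid-price below are hypothetical.

```python
import pandas as pd

# Hypothetical JSON payload such as a REST API might return.
payload = {
    "provider": "example-feed",
    "quotes": [
        {"symbol": "AAPL", "quote": {"bid": 191.1, "ask": 191.3}},
        {"symbol": "MSFT", "quote": {"bid": 415.4, "ask": 415.6}},
    ],
}

# Flatten the nested records into a tabular DataFrame;
# nested keys become dotted columns: quote.bid, quote.ask.
df = pd.json_normalize(payload["quotes"])

# Derive a mid-price column from the flattened fields.
df["mid"] = (df["quote.bid"] + df["quote.ask"]) / 2
print(df[["symbol", "mid"]])
```

From here the DataFrame can be written straight to a database table with `DataFrame.to_sql`.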
HANDLING DATA
- Demonstrated experience creating and designing databases where none previously existed
- Normalize data from multiple sources and store it in corresponding tables, designing storage in a scalable, future-proofed database
- Ability to independently create, seed, and maintain new and existing tables with large volumes of data
- Intermediate knowledge of SQL with the following databases: Postgres, MySQL, or Google BigQuery, or intermediate knowledge of a NoSQL database such as MongoDB
- The database must be designed with a long-term view, as it will support the UI and reporting requirements of the business
- Ability to create reports directly from the database at daily, weekly, or other frequencies, including parameter-based reports that automatically capture data based on rules
- The database must be built to optimize future reporting requirements and support UI capabilities
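The parameter-based reporting requirement above can be sketched as a query with bound parameters. SQLite is used here purely as a self-contained stand-in for Postgres; the trades table, its columns, and the report rules are hypothetical.

```python
import sqlite3

# In-memory SQLite standing in for the production Postgres database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, qty INTEGER, trade_date TEXT)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("AAPL", 100, "2024-01-02"),
     ("AAPL", -40, "2024-01-02"),
     ("MSFT", 25, "2024-01-03")],
)

def daily_report(conn, trade_date, min_qty=0):
    """Parameter-driven report: net quantity per symbol for one day."""
    return conn.execute(
        """SELECT symbol, SUM(qty) AS net
           FROM trades
           WHERE trade_date = ?
           GROUP BY symbol
           HAVING SUM(qty) >= ?""",
        (trade_date, min_qty),
    ).fetchall()

print(daily_report(conn, "2024-01-02"))  # [('AAPL', 60)]
```

Scheduling such a report daily or weekly is then a matter of wiring the function to a SQL Agent job, cron, or similar scheduler.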
Interacting with Data
- Enhance the existing UI by developing new functionalities
DevOps
- Versioning experience with software releases: GitHub, Bitbucket
- Experience working in Agile or Waterfall environments and with tools such as Atlassian JIRA
Cloud (Nice to have)
- Cloud experience such as Azure or Google Cloud preferred
- Preferred experience: Azure Kubernetes Service, Azure SQL, App Services, Azure Functions
- Experience writing, testing, and deploying code in a cloud environment
- Familiarity with basic cloud technology such as storage buckets and cloud serverless functions
- Experience migrating a desktop app to a web-based application is nice to have
Technical Skills
- Python: Selenium, Scrapy, Beautiful Soup
- SQL, Postgres
- JavaScript, CSS, React, Angular
- JIRA / Bitbucket / Git
Soft Skills
- Ability to work with limited direction while remaining open to direction and feedback
- The candidate will be assigned JIRA tickets and will provide daily status reports on each ticket in daily scrum update calls
- The candidate must have experience with ETL processes and be comfortable creating processes to normalize data and store it in the database
- No third party applicants
- not currently sponsoring work visas
Job Type: Contract
Pay: $113,561.00 - $123,798.00 per year
Schedule:
- 8 hour shift
- No weekends
Application Question(s):
- Please briefly explain why you would be a good fit for this role. Data extraction and database experience are key to this role.
Education:
- Bachelor's (Preferred)
Experience:
- Python: 5 years (Preferred)
- Selenium: 5 years (Preferred)
- Web scraping: 4 years (Preferred)
Language:
- English (Required)
Work Location: Remote