Implement a script to parse log files stored in S3, applying some simple filtering and computing summary statistics.
Use Python or Ruby.
The goal is to build summary statistics describing the performance of the service for that
particular day. With that in mind, please generate at least the following statistics:
* average and max response time.
* average and max upstream response time.
* count of HTTP codes by endpoint.
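The statistics above could be computed with a short Python script. This is a minimal sketch only: it assumes a hypothetical space-separated log format (`METHOD endpoint status response_time upstream_response_time`), and the real service's log layout would dictate the actual regex. The `read_s3_lines` helper is likewise an assumption and requires boto3 with configured AWS credentials.

```python
import re
from collections import Counter, defaultdict

# Assumed (hypothetical) log line format, e.g.:
#   GET /api/users 200 0.123 0.045
# Adapt this pattern to the service's real log format.
LOG_PATTERN = re.compile(
    r'(?P<method>\S+)\s+(?P<endpoint>\S+)\s+(?P<status>\d{3})\s+'
    r'(?P<resp_time>[\d.]+)\s+(?P<upstream_time>[\d.]+)'
)

def parse_line(line):
    """Parse one log line into a dict of fields, or None if it doesn't match."""
    m = LOG_PATTERN.match(line.strip())
    return m.groupdict() if m else None

def summarize(lines):
    """Compute the requested daily summary statistics over an iterable of log lines."""
    resp_times, upstream_times = [], []
    codes_by_endpoint = defaultdict(Counter)
    for line in lines:
        rec = parse_line(line)
        if rec is None:
            continue  # simple filtering: skip malformed lines
        resp_times.append(float(rec['resp_time']))
        upstream_times.append(float(rec['upstream_time']))
        codes_by_endpoint[rec['endpoint']][rec['status']] += 1
    return {
        'avg_response_time': sum(resp_times) / len(resp_times) if resp_times else 0.0,
        'max_response_time': max(resp_times, default=0.0),
        'avg_upstream_response_time': (
            sum(upstream_times) / len(upstream_times) if upstream_times else 0.0
        ),
        'max_upstream_response_time': max(upstream_times, default=0.0),
        'codes_by_endpoint': {ep: dict(c) for ep, c in codes_by_endpoint.items()},
    }

def read_s3_lines(bucket, key):
    """Hypothetical S3 reader: stream decoded lines from an object via boto3."""
    import boto3  # imported here so the pure parsing code runs without it
    body = boto3.client('s3').get_object(Bucket=bucket, Key=key)['Body']
    for raw in body.iter_lines():
        yield raw.decode('utf-8')
```

In production the script would call `summarize(read_s3_lines(bucket, key))` for the day's log object; keeping parsing separate from S3 I/O makes the statistics testable on sample lines.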
Hi, I am a software engineer specializing in cloud computing. I have been working on AWS for the last 2 years. I have the following expertise in AWS services:
1. EC2 configuration from the console and using the Java SDK / Python boto3
2. EMR configuration and automation through Java/Python scripts
3. Data pipeline creation and productionization through Java/Python scripts
4. Big data processing using Sqoop for data movement from RDS to EMR HDFS and then
to S3, followed by the COPY command to load the data into Redshift
5. IAM roles and policies configuration
6. CloudFormation
[login to view URL]
8. Looker/Tableau reporting tools
I am currently working on a big data project using AWS services. I also have a good understanding of Agile development, and I have used tools like JIRA and Jenkins on many projects. You will find my work meets your expectations if you give me the opportunity to do this job for you. I am looking forward to your response.
Thanks
$40 USD in 3 days
4.8 (6 reviews)
2.7
2 freelancers are bidding an average of $65 USD for this job
I am ready to do this project for you, and I can assure you that the work will be satisfactory. If I fail to deliver what you are looking for, I won't ask for any money. You won't regret choosing me; please give me a chance and I won't disappoint you.
Here is the deal: I won't ask for any money until the project is complete, and you don't even need to assign the project.