PySpark Experts for hire

  • PySpark

Filters: Countries · Specific location · Exams · Hourly rate (USD) · Rating · Online

Showing 2 results
  • Hire adamovicivan464
    Python Expert And Web Dev
    Serbia · $35 USD / hour
    Rating: 1.9 (1 review)
    Hi dear client, my name is Ivan Adamovic and I live in Belgrade, Serbia. Choosing a proper developer can be a daunting task, but when you choose to work with me you will get a partner who can help you succeed. My skills:
    * Back-end: Python (scraping, ML, DL), Java, Ruby on Rails, PHP / Laravel / CodeIgniter / CakePHP, JSON / JavaScript (Ajax, jQuery), Node.js, Angular.js, MySQL / MSSQL / MongoDB / Oracle / DB2, e-commerce, SEO, API integration
    * Front-end: React.js, Vue.js, responsive design, PSD to HTML, CSS, CSS3, Bootstrap 4, Sass, HTML
    * CMS: WordPress, Shopify, Prestashop, Magento, Looker, Flynax, Joomla
    * Other: Excel VBA, Google Script, Selenium, ...
  • Hire amithnair1
    Data Engineer
    India · $20 USD / hour
    Rating: 1.6 (1 review)
    I am an experienced data engineer with more than 3 years of experience in Spark and Hadoop technology. I have experience in developing Spark applications for the following purposes: 1. Ingesting big data. 2. Performing quality checks on the data. 3. Transforming data into tidy data sets. 4. Reconciling datasets. 5. Performing complex aggregations on big data. I have spent a considerable amount of time understanding the internal workings of Spark, so I can also help you solve any performance-related issues you might be facing in your application. Thanks for viewing my profile.

Hello , here you'll find some offline freelancers that match ""