• Content & news, short video / Series D or later / 2000+ employees
    Responsibilities: Be part of the global monetization team and contribute to driving overseas business growth, with exposure to the full business workflow from ad products to client relationships, and participate in building the data systems that empower the business.
    1. Streamline the data flows in business operations for the overseas monetization business; deliver data products and framework-level data insights;
    2. Define best practices and drive them to implementation in a practical and creative way, based on a deep understanding of product details and business workflows and thorough research of comparable industry products;
    3. Get hands-on with the data workflow prototype, collaborate with the R&D team, and produce high-quality product design documents;
    4. Collaborate closely with stakeholders, fully understand the business pain points, balance short-term and long-term goals, and build a clear, realistic, agreed-upon roadmap: support the business with short-term fixes and strategy while driving long-term product iteration, development, and release.
    Requirements:
    1. Bachelor's degree or above; 3+ years of product management experience, including 2+ years in data products;
    2. Strong SQL skills for querying and debugging raw data (see the sketch after this posting);
    3. Familiar with data analysis frameworks; able to break business scenarios down into a logically clear data product system;
    4. Able to identify problems independently and propose solutions;
    5. Able to produce high-quality PRDs and prototypes;
    6. Fluent spoken and written English;
    7. Experience in data analysis, data science, data engineering, statistics, or analytics is a plus.
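    As a rough illustration of the SQL-based raw-data debugging called for in requirement 2, the following minimal sketch runs two common quality checks (duplicate keys, null measures) against a throwaway SQLite table; the table, columns, and sample rows are hypothetical, not taken from the posting.

```python
# Minimal sketch of SQL data-quality checks run from Python.
# Table and column names (ad_events, event_id, revenue) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ad_events (event_id TEXT, country TEXT, revenue REAL);
    INSERT INTO ad_events VALUES ('e1', 'US', 1.2), ('e1', 'US', 1.2), ('e2', NULL, NULL);
""")

checks = {
    "duplicate_event_ids": """
        SELECT event_id, COUNT(*) AS n
        FROM ad_events GROUP BY event_id HAVING COUNT(*) > 1
    """,
    "null_revenue_rows": "SELECT COUNT(*) FROM ad_events WHERE revenue IS NULL",
}

for name, sql in checks.items():
    # Each check prints the offending rows / counts so they can be triaged.
    print(name, conn.execute(sql).fetchall())
```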
  • 25k-40k / 10+ years experience / Bachelor's degree
    Education / No funding needed / 500-2000 employees
    Responsibilities: Job Title: Head (Data Intelligence Department); Department: Data Intelligence Department, HKUST(GZ)
    As a higher education institution committed to innovation and data-driven decision-making, HKUST(GZ) is establishing a new Data Intelligence Department aimed at harnessing the power of data to enhance operational efficiency and strategic growth across all levels of the organization. We are seeking a visionary leader to head this critical initiative. The ideal candidate will possess a blend of strategic vision, technical expertise, and leadership skills to build and nurture a high-performing team dedicated to data governance, data analytics, and business intelligence.
    Duties
    1. Drive cooperation among various departments to develop and implement a university-wide data governance strategy that aligns with the University's mission and strategic goals.
    2. Establish the data governance framework, develop long-term plans for the AI/HPC Data Center and Smart Campus, and promote the use of computing technologies and improvements in energy efficiency to achieve the vision of a "Sustainable Smart Campus".
    3. Develop and promote the implementation of campus data governance policies, including standards for the allocation and usage of computing resources and smart campus services, policies for the entire data lifecycle, and category- and class-based data security and protection.
    4. Lead innovation in the management and services of the AI/HPC Data Center to meet the computational demands of research, teaching, and future development, and enhance the University's research capabilities by improving operational efficiency and service capabilities.
    5. Oversee the operation and intelligent upgrades of Smart Campus systems, including multimedia and research/teaching facilities, office and life services, security access, and smart buildings, ensuring that the systems operate and interact efficiently and upgrading the design of smart campus services.
    6. Supervise data compliance to ensure it meets domestic and international data protection laws and standards, ethical norms, and the University's data confidentiality requirements.
    7. Guide the data teams in the University's departments to establish and optimize data processing workflows, foster a culture of data-driven decision-making, promote the accumulation of data into data assets, and drive continuous institutional improvement through the strategic use of data across the University.
    Requirements: A qualified candidate should:
    1. Hold a bachelor's degree or above in Data Science, Computer Science, or a related field.
    2. Have at least 10 years of relevant work experience, including data management and team leadership in large organizations.
    3. Possess excellent communication and teamwork skills, and be able to collaborate across departments to promote the implementation of data policies.
    4. Be familiar with domestic and international data security and compliance policies and related regulations, such as China's Data Security Law and Cybersecurity Law, with experience handling data compliance in multiple jurisdictions.
    5. Have strong capabilities in strategic planning, organizational management, innovation management, and change implementation.
    This is a Mainland appointment, and the appointee will be offered a contract by the HKUST(GZ) entity in accordance with Mainland labor laws and regulations.
Starting salary will be commensurate with qualifications and experience.
  • 20k-40k / 3-5 years experience / Bachelor's degree
    Content & news, short video / Series D or later / 2000+ employees
    Responsibilities:
    1. Reconstruct the chain of internet risk incidents through data analysis, feature mining, and logical reasoning, and lead the analysis of risk cases and incidents;
    2. Distill risk patterns and features from risk analysis and standardize risk data, so as to drive optimization of business logic and upgrades of prevention-and-control rules, models, and data products;
    3. Develop a deep understanding of the risk-control business, summarize the methodology of case and incident analysis, and drive its consolidation into products.
    Requirements:
    1. Degree in statistics, mathematics, computer science, or another data-related major; proficient in SQL; strong data awareness;
    2. Familiar with internet data and risk-control business; 2+ years of experience in internet risk-control strategy or case analysis;
    3. Strong logical thinking and curiosity; good at digging into and exploring problems;
    4. Strong communication and collaboration skills; good at drawing on different resources to get work done.
  • 25k-40k / 5-10 years experience / Bachelor's degree
    Travel & mobility / Series D or later / 500-2000 employees
    We are seeking a data professional with combined expertise in data analysis and data warehousing to join our dynamic team. The ideal candidate will focus on leveraging data analysis to drive overall data initiatives. This role will be responsible for designing and optimizing data models, handling complex datasets, and improving data quality and processing efficiency. We expect the candidate to work independently while also collaborating closely with the team as needed to drive data-driven business growth.
    Key Responsibilities:
    - Design and maintain data warehouse architecture to support various business lines, including but not limited to Attractions, Mobility, and Hotels.
    - Develop a deep understanding of each business line and use data analysis to support business decisions and strategy development.
    - Build and optimize data models to ensure data accuracy and reliability (see the sketch after this posting).
    - Independently handle and optimize complex datasets, enhancing data quality and processing workflows.
    - Collaborate closely with both business and technical teams to ensure data solutions meet business requirements.
    - Write technical documentation and maintain the data warehouse's data dictionary.
    Qualifications:
    - Over 5 years of experience in the data warehouse field.
    - Proficient in SQL and database technologies, with hands-on experience managing large-scale databases.
    - Strong experience in data model construction, with the ability to independently design and optimize complex data models.
    - Extensive experience in data quality and underlying data processing, with the ability to effectively resolve data issues.
    - Familiarity with DBT and practical experience with it is preferred.
    - Strong analytical thinking and problem-solving skills, with the ability to complete projects independently.
    - Excellent communication and teamwork skills, capable of effectively interacting with team members from diverse backgrounds.
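    As a rough illustration of the data-model construction this posting describes, the following minimal sketch derives a small dimension and fact table from raw bookings with pandas; the table and column names are hypothetical and only loosely echo the business lines named above. In a DBT-based warehouse the same transform would typically live as a SQL model instead.

```python
# Minimal sketch of a tiny star schema built from raw bookings.
# All names (bookings, dim_product, fct_bookings) are hypothetical.
import pandas as pd

bookings = pd.DataFrame({
    "booking_id": [1, 2, 3],
    "product_line": ["Attractions", "Mobility", "Attractions"],
    "amount_usd": [59.0, 23.5, 12.0],
})

# Dimension: one row per product line, with a surrogate key.
dim_product = (
    bookings[["product_line"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .assign(product_key=lambda d: d.index + 1)
)

# Fact: bookings joined to the dimension, keeping only keys and measures.
fct_bookings = (
    bookings.merge(dim_product, on="product_line")
    [["booking_id", "product_key", "amount_usd"]]
)

print(dim_product)
print(fct_bookings)
```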
  • 25k-40k / 5-10 years experience / Bachelor's degree
    Consumer & lifestyle / No funding needed / 2000+ employees
    Responsibilities
    1. Assess all visualization and reporting requirements and develop a long-term strategy for the various dashboard and reporting solutions.
    2. Communicate effectively with business partners to understand their needs and design reports accordingly.
    3. Collect and understand the business logic behind all reports and translate it into data model design requirements.
    4. Manage projects, prepare updates, and implement all phases of a project.
    5. Turn data into insights with actionable execution plans and influence key stakeholders to implement the solutions.
    6. Provide training to business teams on BI tool usage and dashboard creation.
    Requirements
    1. Proficient in using SQL, Python, and R for data manipulation and analysis.
    2. Experienced in data visualization and dashboard development with tools such as Tableau.
    3. Excellent presentation, project management, and people management skills.
    4. Bachelor's degree or above in statistics, business analytics, mathematics, computer science, or a related field with training in data and analytics.
    5. 5+ years of experience managing analytics and BI projects at senior-analyst level, with successful projects using data to drive business value.
    6. Experience in data governance, data quality management, data processing, and insights.
    7. Strong in data visualization tools (e.g., Tableau, Power BI) as well as BI portal products.
    8. Excellent project planning and organization; able to use data to identify and solve problems.
    9. Experience in retail, CRM, supply chain, or production is a plus.
  • 45k-65k / 5-10 years experience / Bachelor's degree
    Other / No funding needed / 2000+ employees
    Your role
    QIMA has a 35%/year growth pace, 20% of which comes from acquisitions. It is paramount that we integrate newly acquired companies quickly so that they can extend the benefit of our state-of-the-art data management and dashboards to our new clients and colleagues. Data integration plays a key role here: how do we manage to understand, quickly and unambiguously, the data of a newly acquired company? How do we connect this data to our existing data flows? Data plays a key role at QIMA: we master the entire data flow, from data collection (with our own inspectors, auditors, and labs), through data processing (BI and data scientists), to data actionability (insights for our customers and for our managers). Data is spread across different departments and areas of expertise inside the company: Marketing, Operations, Sales, IT, and more. Data governance is key, and collaboration around data will unlock the potential to bring even more value to our customers about the quality of their products, and to our managers about their operations. These challenges lead us to look for our Head of Data. In this role, your main responsibilities will include, but not be limited to:
    - Project Management
      o Imagine the business cases of data projects by exchanging with stakeholders, and deliver them
      o Lead the transversal projects around our data warehouse and cloud ETL
      o Lead the Master Data Management projects, leveraging the key skills and technologies already in place across departments
      o Drive the data integration of newly acquired companies within the QIMA group in order to synchronize reporting dashboards and provide a transversal understanding of the business
      o Lead discussions and integration projects with external partners
      o Track results and provide continuous improvement
      o Be responsible for the budget, roadmap, quality, and delivery of these projects
    - People Management
      o Manage the Data Engineering and Business Intelligence teams
    - Community Animation
      o Animate data governance across domains and departments
      o Be the guardian of data quality in the group, challenge data inconsistencies, and ensure that data is shared by all departments in all circumstances
      o Implement knowledge-sharing practices inside the data community
      o Be responsible for data lineage and data quality
    - Management of the run
      o Cooperate with our IT support organization to set up support for the newly created systems
      o Organize and manage their day-to-day operation and support
    Requirements: In order to succeed in this role, you must:
    - Hold a Master's degree in computer science
    - Have extensive experience and knowledge in data solution architecting
    - Have experience in transversal projects
    - Be hands-on and autonomous, at ease discussing with a CEO or a field operator
    - Be open-minded, agile with change, and pragmatic
    - Be able to drive a workstream and train end users
    - Be able to work in a multinational environment and on multiple simultaneous projects
    - Have strong communication skills, both oral and written
    - Have excellent teamwork and interpersonal skills
    - Be fluent in English: daily use is required with our colleagues all over the world
    - If you are based in Europe, be willing and able to travel.
  • 35k-50k / 5-10 years experience / Bachelor's degree
    Other / No funding needed / 2000+ employees
    Do you have experience in architecting data at the scale of a global organization? Do you enjoy working on cutting-edge technologies while supporting end users in achieving their business goals?
    About QIMA
    You will be linking our new companies in the Americas, our teams in Europe (mostly France), and our teams in Asia (China, Hong Kong, Philippines).
    Your role
    QIMA has a 35%/year growth pace, 20% of which comes from acquisitions. It is paramount that we integrate newly acquired companies quickly so that they can extend the benefit of our state-of-the-art data management and dashboards to our new clients and colleagues. Data integration plays a key role here: how do we manage to understand, quickly and unambiguously, the data of a newly acquired company? How do we connect this data to our existing data flows? Data plays a key role at QIMA: we master the entire data flow, from data collection (with our own inspectors, auditors, and labs), through data processing (BI and data scientists), to data actionability (insights for our customers and for our managers). Data is spread across different departments and areas of expertise inside the company: Marketing, Operations, Sales, IT, and more. Data governance is key, and collaboration around data will unlock the potential to bring even more value to our customers about the quality of their products, and to our managers about their operations. These challenges lead us to look for our Head of Data. In this role, your main responsibilities will include, but not be limited to:
    - Project Management
      o Imagine the business cases of data projects by exchanging with stakeholders, and deliver them
      o Lead the transversal projects around our data warehouse and cloud ETL
      o Lead the Master Data Management projects, leveraging the key skills and technologies already in place across departments
      o Drive the data integration of newly acquired companies within the QIMA group in order to synchronize reporting dashboards and provide a transversal understanding of the business
      o Lead discussions and integration projects with external partners
      o Track results and provide continuous improvement
      o Be responsible for the budget, roadmap, quality, and delivery of these projects
    - People Management
      o Manage the Data Engineering and Business Intelligence teams
    - Community Animation
      o Animate data governance across domains and departments
      o Be the guardian of data quality in the group, challenge data inconsistencies, and ensure that data is shared by all departments in all circumstances
      o Implement knowledge-sharing practices inside the data community
      o Be responsible for data lineage and data quality
    - Management of the run
      o Cooperate with our IT support organization to set up support for the newly created systems
      o Organize and manage their day-to-day operation and support
    Requirements: In order to succeed in this role, you must:
    - Hold a Master's degree in computer science
    - Have extensive experience and knowledge in data solution architecting
    - Have experience in transversal projects
    - Be hands-on and autonomous, at ease discussing with a CEO or a field operator
    - Be open-minded, agile with change, and pragmatic
    - Be able to drive a workstream and train end users
    - Be able to work in a multinational environment and on multiple simultaneous projects
    - Have strong communication skills, both oral and written
    - Have excellent teamwork and interpersonal skills
    - Be fluent in English: daily use is required with our colleagues all over the world
    - If you are based in Europe, be willing and able to travel.
    We offer: a competitive package, performance bonus, fast career progression, and international career opportunities.
  • 25k-35k x 14 months / 5-10 years experience / Master's degree
    IT services & consulting / Public company / 2000+ employees
    Responsibilities:
    - Develop AI and machine learning solutions to optimise business performance across different areas of the organisation;
    - Build exploratory analyses to identify the root causes of a particular business problem;
    - Explore and analyze all the data to understand customer behavior and provide a better customer experience;
    - Use data analytics to help in-country Marketing, Distribution, Operations, Compliance, and HR teams increase revenue, lower costs, and improve operational efficiency;
    - Identify actionable insights, suggest recommendations, and influence the direction of the business across cross-functional groups;
    - Build MIS to track the performance of implemented models, validate their effectiveness, and show the financial benefits to the organization;
    - Support the data analytics team on data extraction and manipulation;
    - Help the decision analytics team create documentation for all analytical solutions developed;
    - Build the data infrastructure necessary for the development of predictive models and more sophisticated data analysis;
    - Develop data analyses for different business units.
    Requirements:
    - A numerate discipline, e.g. Actuarial Science, Mathematics, Statistics, Engineering, or Computer Science, with a strong computer programming component;
    - Demonstrable understanding of data quality risks and the ability to carry out the quality checks needed to validate results;
    - Knowledge of a variety of machine learning techniques (clustering, decision trees, random forests, artificial neural networks, etc.; see the sketch after this posting);
    - Proven hands-on experience with at least one advanced data analysis platform (e.g. Python, R);
    - Sound knowledge of SQL programming or other programming experience;
    - Articulate, with excellent oral and written communication skills; a fast learner with a willing attitude; intellectually rigorous, with strong analytical skills and a passion for data.
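    As a rough illustration of two of the machine learning techniques listed above (clustering and a random forest), the following minimal sketch runs both on synthetic data with scikit-learn; the features, target, and interpretation as "customer" data are hypothetical.

```python
# Minimal sketch: segment synthetic customers with KMeans, then fit a
# random forest classifier and check hold-out accuracy.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))            # hypothetical customer features
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # hypothetical binary outcome

# Unsupervised: group customers into segments.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Supervised: predict the outcome and evaluate on a hold-out split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("segment sizes:", np.bincount(segments))
print("hold-out accuracy:", model.score(X_te, y_te))
```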
  • Software services & consulting, IT services & consulting / No funding needed / 50-150 employees
    Foreign bank; two-day weekends; no overtime. There are actually five openings:
    4-9 years: data analysis; deep learning, NLP, transformer models
    3-6 years: data development; Hadoop, Spark, Kafka, SQL, Flink
    3-6 years: data development; Python, SQL, pandas, databases
    2-6 years: Machine Learning Engineer; Python, Docker, CI/CD; machine learning and/or ML model monitoring experience a plus
    6-9 years: experience in HDFS, MapReduce, Hive, Impala, Sqoop, Linux/Unix technologies; Spark is an added advantage
    Job Duties & Responsibilities:
    • Support finance regulatory reporting projects and applications as an ETL developer | Big Data applications following the Agile software development life cycle | level 2/3 support of the data warehouse.
  • 25k-35k x 13 months / 5-10 years experience / Bachelor's degree
    Consumer & lifestyle / No funding needed / 2000+ employees
    1. Who you are
    As a person you are motivated to continuously develop and enhance your programming skills and your knowledge of machine learning and AI applications. You are passionate about the IKEA business and our values, and about how they apply to the data management process. Furthermore, you are energized by working both independently and interdependently with the architecture network and across functions. You appreciate the mix of strategic thinking and turning architecture trends into practice. Last but not least, you share and live the IKEA culture and values.
    In this role you have proven advanced training in (computer) engineering, computer science, econometrics, mathematics, or equivalent. You have experience and knowledge of working with large datasets and distributed computing architectures. You have coding knowledge (R and/or Python) and experience developing artificial intelligence applications. You have experience in statistical analysis and statistical software. In addition, you are confident in data processing and analysis. You have demonstrable experience of working in an Agile or DevOps set-up. You have knowledge in the following areas:
    • Knowledge of data set processes for data modelling, mining, and production, as well as visualization tools, e.g. Tableau
    • Knowledge of at least one building block of artificial intelligence
    • Knowledge of machine learning development languages (R and/or Python) and other developments in the fast-moving data technology landscape, e.g. Hive or Hadoop
    • Knowledge of DevOps and agile development practices
    • Exposure to the development of industrialised analytical software products
    • Knowledge of IKEA's corporate identity, core values, and vision of creating a better everyday life for the many people
    We believe that you are able to work with large amounts of raw data and feel comfortable working with different programming languages. You have high intellectual curiosity, with the ability to develop new knowledge and skills and to use new concepts, methods, digital systems, and processes to improve performance. You are able to provide input to user-story prioritisation as appropriate, based on new ideas, approaches, and strategies. You are able to understand the complexity of the IKEA business and the role of data and information as an integrated part of the business.
    2. Your responsibilities
    As Data Scientist you will perform predictive and prescriptive modelling and help develop and deploy artificial intelligence and machine learning algorithms to reinvent and grow the IKEA business in a fast-changing digital world.
    You will:
    • Explore and examine data for use in predictive and prescriptive modelling to deliver insights to business stakeholders and support them in making better decisions
    • Influence product and strategy decision-making by helping to visualize complex data and enabling change through knowledge of the business drivers that make the product successful
    • Support senior colleagues with modelling projects, identifying requirements and building tools that are statistically grounded, explainable, and able to adapt to changing attributes
    • Support the development and deployment of different analytical building blocks that support the requirements of the capability area and customer needs
    • Use root-cause research to identify process breakdowns and, drawing on various skill sets, provide the data needed to find solutions to those breakdowns
    • Work closely with other data scientists and across functions to help produce all required design specifications and ensure that data solutions work together and fulfil business needs
    • Work across initiatives within INGKA Group, steering towards data-driven solutions
  • 40k-60k / 5-10 years experience / Bachelor's degree
    Automotive & mobility / Public company / 2000+ employees
    The Role
    We are looking for a Data Engineer to be part of our Applications Engineering team. This person will design, develop, maintain, and support our Enterprise Data Warehouse and BI platform within Tesla using various data and BI tools. This position offers a unique opportunity to make a significant impact on the entire organization by developing data tools and driving a data-driven culture.
    Responsibilities:
    • Work in a time-constrained environment to analyze, design, develop, and deliver Enterprise Data Warehouse solutions for Tesla's Sales, Delivery, and Logistics teams.
    • Set up, maintain, and optimize the big data platform for production use in reporting, analysis, and ML applications.
    • Establish scalable, efficient, automated processes for data analyses, model development, validation, and implementation.
    • Create ETL pipelines using Spark/Flink (see the sketch after this posting).
    • Create real-time data streaming and processing using open-source technologies like Kafka and Spark.
    • Develop collaborative relationships with key business sponsors and IT resources for the efficient resolution of work requests.
    • Provide timely and accurate estimates for newly proposed functionality enhancements and critical situations.
    • Develop, enforce, and recommend enhancements to applications in the areas of standards, methodologies, compliance, and quality assurance practices; participate in design and code walkthroughs.
    Qualifications:
    Minimum Qualifications:
    • Proficient experience in building large-scale Spark batch applications.
    • Strong experience in Data Warehouse ETL design and development, methodologies, tools, processes, and best practices.
    • Strong experience creating stellar dashboards and reports for C-level executives.
    • A good understanding of "one module, one ID and one service".
    Preferred Qualifications:
    • 3+ years of development experience in open-source technologies like Scala and Java.
    • Excellent experience with Hortonworks/Cloudera platforms.
    • Practical experience using HDFS.
    • Relevant working experience with Docker and Kubernetes preferred.
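    As a rough illustration of the Spark work this posting describes, the following minimal sketch combines a small batch ETL step with a Kafka structured-streaming read; the paths, topic, broker, and column names are hypothetical, and it assumes a PySpark environment with the Spark-Kafka connector package available.

```python
# Minimal sketch: batch ETL to Parquet plus a Kafka streaming ingest.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Batch ETL: read raw orders, aggregate daily revenue, write a warehouse table.
orders = spark.read.json("/data/raw/orders/")  # hypothetical input path
daily = (orders
         .withColumn("order_date", F.to_date("created_at"))
         .groupBy("order_date", "region")
         .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue")))
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/dw/daily_orders/")

# Streaming: consume order events from Kafka and land them continuously.
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
          .option("subscribe", "orders")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))
query = (stream.writeStream.format("parquet")
         .option("path", "/data/stream/orders/")
         .option("checkpointLocation", "/data/stream/_chk/")
         .start())
```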
  • 30k-35k x 15 months / 3-5 years experience / Bachelor's degree
    Automotive & mobility / Unfunded / 500-2000 employees
    BMW is focusing on the development of automated driving (AD) with great effort. In order to develop and test highly intelligent software for environmental perception, maneuver planning, and acting, which finally enables the self-driving capability of an autonomous car, a huge amount of environmental sensor data needs to be collected, hosted, and processed in an AD Development Platform (AD Data Center). The processing covers, e.g., broad data analytics, reprocessing and KPI evaluation, simulation, virtual endurance runs, etc. The BMW AD team in China is fully integrated into the global network of highly educated experts from different domains. In this job you will play a crucial role at the interfaces between BMW AD SW development and IT teams as well as Chinese internet tech companies and high-tech AD suppliers. Big data, distributed data, and RaaS (Reprocessing as a Service) (e.g. MapReduce, HDFS, Airflow, Jenkins, CI/CD, etc.) will be your core focus. Your prime target will be to accelerate AD product development by analyzing FTs' requirements, designing big data architectures, tasking the development of related big data applications, and managing their deployment to a massive off-site data center BMW has just developed in cooperation with a big Chinese tech player. In order to achieve this target you will be exposed to a complex project environment involving both BMW internal and external partners. Technologies you will be working with are Docker, OpenShift, Kubernetes, Grafana, Spark, etc.
    Major Responsibilities:
    - Design, deployment, and improvement of distributed big data systems targeting automated driving applications;
    - Tasking the development and improvement of off-board backend big data applications and related communication protocols targeting automated driving use cases;
    - Management and structuring of requirements provided by BMW's AD development teams regarding the big data AD platform;
    - Steering of BMW China AD development teams and external cooperation partners (a big Chinese tech player) regarding data-driven development;
    - Review, monitor, and report the status of your own project;
    - Support the budget planning process for the organizational unit;
    - Research, recommend, design, and develop our big data architecture; ensure system, technical, and product architectures are aligned with the China data platform strategy.
    Qualifications:
    - Master's/Bachelor's degree in Computer Science, Electrical/Electronic Engineering, Information Technology, or another related field, or equivalent;
    - Good communication skills and good language skills in English; German language skills optional and appreciated;
    - Rich experience in big data processing; Hadoop/Spark ecosystem applications such as Hadoop, Hive, Spark, and Kafka preferred;
    - Solid programming skills, e.g. Java, Scala, or Python;
    - Rich experience with Docker and Kubernetes;
    - Familiar with CI/CD tools such as Jenkins and Ansible;
    - Substantial knowledge and experience in software development and engineering in (distributed big data) cloud/compute/data lake/data center systems (MapR, OpenShift, Docker, etc.);
    - Experience with scheduling tools is a plus; Airflow and Oozie preferred (see the sketch after this posting);
    - Experience with AD data analysis, e.g. lidar, radar, image/video data, and CAN bus data;
    - Experience in building and maintaining data quality standards and processes;
    - Strong background working with Linux/UNIX environments;
    - Strong Shell/Perl/Ruby/Python scripting experience;
    - Experience with other NoSQL databases is a plus; Elasticsearch and HBase preferred;
    - Solid communication skills, with the ability to communicate sophisticated technical concepts and align on decisions with global and China ***** partners;
    - Passion to stay on top of the latest happenings in the tech world and an attitude to discuss and bring those into play;
    - Strong hands-on experience in data warehouse ETL design and development; able to build scalable and complex ETL pipelines for source data in different formats.
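    As a rough illustration of the scheduling-tool experience mentioned above, the following minimal sketch defines an Airflow DAG that runs a Spark reprocessing step followed by a KPI evaluation; the DAG id, commands, script paths, and schedule are hypothetical, and it assumes Airflow 2.4 or later.

```python
# Minimal sketch: a daily two-step pipeline orchestrated with Airflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="drive_data_reprocessing_sketch",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Reprocess the day's recorded drive data (hypothetical Spark job).
    reprocess = BashOperator(
        task_id="reprocess_drive_data",
        bash_command="spark-submit /opt/jobs/reprocess_drives.py --date {{ ds }}",
    )
    # Evaluate KPIs on the reprocessed output (hypothetical Spark job).
    evaluate = BashOperator(
        task_id="evaluate_kpis",
        bash_command="spark-submit /opt/jobs/evaluate_kpis.py --date {{ ds }}",
    )
    reprocess >> evaluate
```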
  • 12k-18k / 1-3 years experience / Associate degree
    Information security, data services / No funding needed / 15-50 employees
    Roles & Responsibilities
    Design, develop, and maintain business intelligence solutions using Tableau.
    Manage MS SQL databases and develop SQL scripts – views and stored procedures.
    Work on the SSIS platform for data integration and workflow applications.
    Create data models to support the design and development of Tableau reports and dashboards.
    Perform unit testing of reports and dashboards to ensure that they meet the specifications and requirements.
    Collaborate with cross-functional teams, business analysts, and stakeholders to understand the data requirements and design visualizations that provide insights and support decision-making.
    Document the design, development, and maintenance of reports and dashboards, e.g. by creating user manuals and training materials for end users.
    Job Requirements
    A minimum of 2 years of relevant experience.
    Knowledge of SQL and Tableau is a must.
    Knowledge of MS SQL/SSIS is a must.
    Knowledge of Excel VBA/Python is a big advantage.
    Skills in Microsoft Office (Word, PowerPoint, and Excel) are essential.
    Strong verbal and written communication.
  • 20k-25k / 3-5 years experience / Bachelor's degree
    AI services / Series A / 15-50 employees
    Description:
    We are looking for an experienced data engineer to join our team. You will use various methods to transform raw data into useful data systems. For example, you'll analyze existing data flows, design the data flows between sub-systems, establish ETL processes, create algorithms, and conduct statistical analysis. Overall, you'll strive for efficiency by aligning data systems with business goals. To succeed in this data engineering position, you should have strong analytical skills and the ability to combine data from different sources. Data engineering skills also include familiarity with several programming languages and knowledge of machine learning methods. If you are detail-oriented, with excellent organizational skills and experience in this field, we'd like to hear from you. As a mar-tech-centric team, it is better if you have domain knowledge of CDP (Customer Data Platform), DMP (Data Management Platform), CRM, and SCRM.
    Responsibilities:
    Analyze and organize raw data
    Analyze existing systems and data flows; build data systems and pipelines
    Evaluate business needs and objectives
    Interpret trends and patterns
    Conduct complex data analysis and report on results
    Prepare data for prescriptive and predictive modeling
    Build algorithms and prototypes
    Combine raw information from different sources (see the sketch after this posting)
    Explore ways to enhance data quality and reliability
    Identify opportunities for data acquisition
    Develop analytical tools and programs
    Collaborate with data scientists and architects on several projects
    Hands-on establishment of the Data Warehouse-CDP-CRM-marketing tool stack
    Requirements:
    Previous experience as a data engineer or in a similar role
    Technical expertise with data models, data mining, and segmentation techniques
    Knowledge of programming languages (e.g. Java and Python)
    Hands-on experience with SQL database design
    Great numerical and analytical skills
    Knowledge of and working experience with CDP and CRM; familiar with third-party vendor products such as Salesforce, Segment, etc.
    Degree in Computer Science, IT, or a similar field; a Master's is a plus
    Data engineering certification (e.g. IBM Certified Data Engineer) is a plus
    Entrepreneurial mindset and focus
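    As a rough illustration of the "combine raw information from different sources" responsibility, the following minimal sketch merges a CRM export with web events into a single customer view using pandas; the sample data and column names are hypothetical, and a real CDP pipeline would pull from actual source systems rather than inline frames.

```python
# Minimal sketch: clean two hypothetical sources and join them into one view.
import pandas as pd

crm = pd.DataFrame({"customer_id": [1, 2], "email": ["A@x.com ", "b@x.com"]})
web = pd.DataFrame({"customer_id": [1, 1, 2], "page": ["home", "pricing", "home"]})

# Clean: normalise emails, drop duplicate web events.
crm["email"] = crm["email"].str.strip().str.lower()
web = web.drop_duplicates()

# Combine: per-customer activity counts joined onto the CRM profile.
activity = web.groupby("customer_id").size().rename("page_views").reset_index()
customer_view = (crm.merge(activity, on="customer_id", how="left")
                    .fillna({"page_views": 0}))
print(customer_view)
```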
  • 15k-18k / Experience open / Master's degree
    Finance / No funding needed / 15-50 employees
    0x0000-Duties
    1) Maintenance of the existing data platform: including but not limited to advancing the automated operation and maintenance of data.
    2) Iterative development of the data system: including but not limited to optimizing the storage structure and storage format of data sources to improve computation speed and storage efficiency.
    3) Other data-related work, such as cleaning and validating the relevant data.
    0x0001-Requirements
    1) Major in computer science, software engineering, information management, mathematics, big data, or another science or engineering field.
    2) Familiar with Python or at least one other mainstream programming language, and familiar with basic commands in a Linux working environment.
    3) Skilled in SQL and experienced with at least two database engines.
    4) Basic knowledge of big data; some hands-on application experience is a plus.
    0x0002-Bonus
    1) R&D experience with time-series databases or HDF5 (see the sketch after this posting).
    2) Awards such as MCM or ACM-ICPC.
    3) Fund practitioner qualification certificate.
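    As a rough illustration of the HDF5-based time-series storage mentioned in the bonus items, the following minimal sketch writes tick data to an HDF5 file with pandas' HDFStore and reads back a time slice; it assumes the PyTables package is installed, and the symbol, file name, and columns are hypothetical. The 'table' format is chosen because it supports appends and on-disk queries, which matters when optimizing storage structure for fast retrieval.

```python
# Minimal sketch: store hypothetical tick data in HDF5 and query a time slice.
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-02 09:30", periods=5, freq="1min")
ticks = pd.DataFrame(
    {"price": np.linspace(10.0, 10.4, 5), "volume": [100, 80, 120, 90, 110]},
    index=idx,
)

with pd.HDFStore("ticks.h5", complevel=9, complib="blosc") as store:
    # Table format allows appending new ticks and querying by index on disk.
    store.put("sz000001", ticks, format="table", data_columns=True)
    recent = store.select("sz000001", where="index >= '2024-01-02 09:32'")

print(recent)
```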