Friday, 26 January 2024

Build Your Own Programming Language: A programmer's guide to designing compilers, DSLs and interpreters for solving modern computing problems, 2nd Edition

 


Written by the creator of the Unicon programming language, this book will show you how to implement programming languages to reduce the time and cost of creating applications for new or specialized areas of computing.

Key Features

  • Solve pain points in your application domain by building a custom programming language
  • Learn how to create parsers, code generators, semantic analyzers, and interpreters
  • Target bytecode, native code, and preprocess or transpile code into another high-level language

Book Description

The need for different types of computer languages is growing, as is the need for domain-specific languages. Building your own programming language has its advantages, as it can be your antidote to the ever-increasing complexity of software.

In this book, you'll start by implementing the frontend of a compiler for your language, including a lexical analyzer and a parser that handles parse errors. The book then covers a series of traversals of syntax trees, culminating in code generation for a bytecode virtual machine or native code. You’ll also manage data structures and output code when writing a preprocessor or a transpiler.

Moving ahead, you'll learn how domain-specific language features are often best represented by operators and functions that are built into the language, rather than library functions. We'll conclude with how to implement garbage collection. Throughout the book, Dr. Jeffery weaves in his experience from building the Unicon programming language to give better context to the concepts. Relevant examples are provided in Unicon and Java so that you can follow the code of your choice. In this edition, code examples have been extended and further tested.

By the end of this book, you'll be able to build and deploy your own domain-specific languages, capable of compiling and running programs.

What you will learn

  • Perform requirements analysis for the new language and design language syntax and semantics
  • Write lexical and context-free grammar rules for common expressions and control structures
  • Develop a scanner that reads source code and generate a parser that checks syntax
  • Build key data structures in a compiler and use your compiler to build a syntax-coloring code editor
  • Write tree traversals that insert information into the syntax tree
  • Implement a bytecode interpreter and run bytecode generated by your compiler
  • Write native code and run it after assembling and linking using system tools
  • Preprocess and transpile code from your language into another high-level language
  • Implement garbage collection in your language

Who This Book Is For

This book is for software developers interested in the idea of inventing their own language or developing a domain-specific language. Computer science students taking compiler construction courses will also find this book highly useful as a practical guide to language implementation to supplement more theoretical textbooks. We assume most readers will have intermediate or better proficiency in a high-level programming language such as Java or C++.

Table of Contents

  1. Why Build Another Programming Language?
  2. Programming Language Design
  3. Scanning Source Code
  4. Parsing
  5. Syntax Trees
  6. Symbol Tables
  7. Checking Base Types
  8. Checking Types on Function Calls and Structure Accesses
  9. Intermediate Code Generation
  10. Syntax Coloring in an IDE
  11. Preprocessors and Transpilers
  12. Bytecode Interpreters
  13. Generating Bytecode
  14. Native Code Generation
  15. Built-in Operators and Functions
  16. Control Structures   

Hard Copy : Build Your Own Programming Language: A programmer's guide to designing compilers, DSLs and interpreters for solving modern computing problems, 2nd Edition



How much do you know about functional programming in Python?

 


a. A lambda cannot be used with the reduce() function.

Answer

False
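A quick check of this answer: reduce() accepts any callable, including a lambda.

```python
from functools import reduce

# A lambda is a perfectly valid function argument to reduce():
total = reduce(lambda acc, x: acc + x, [1, 2, 3, 4])  # ((1 + 2) + 3) + 4
print(total)  # 10
```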

b. lambda, map(), filter(), and reduce() can be combined in a single expression.

Answer

True
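For example, all four can be combined in one expression. This made-up computation keeps the even numbers, squares them, then sums the squares:

```python
from functools import reduce

# lambda, map(), filter(), and reduce() in a single expression:
result = reduce(lambda a, b: a + b,
                map(lambda x: x * x,
                    filter(lambda x: x % 2 == 0, range(1, 7))))
print(result)  # 4 + 16 + 36 = 56
```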

c. Though functions can be assigned to variables, they cannot be called using these variables.

Answer

False
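A variable bound to a function is just another reference to the same function object, so calling through it works fine:

```python
def greet(name):
    return "Hello, " + name

say = greet           # assign the function object to a variable
print(say("Ada"))     # calling through the variable works: Hello, Ada
```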

d. Functions can be passed as arguments to functions and returned from functions.

Answer

True
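A small illustration (the names twice, add3, and add6 are invented for this example): twice() receives a function as an argument and returns a new function built from it.

```python
def twice(f):
    # f arrives as an argument; a new function is returned.
    def wrapper(x):
        return f(f(x))
    return wrapper

add3 = lambda x: x + 3
add6 = twice(add3)    # a function returned from a function
print(add6(10))       # 10 + 3 + 3 = 16
```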

e. Functions can be built at execution time, the way lists, tuples, etc. can be.

Answer

True
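Each call to the factory function below creates a fresh function object at run time, just as a list literal creates a fresh list:

```python
def make_power(n):
    # a new function object is created on every call
    def power(x):
        return x ** n
    return power

square = make_power(2)
cube = make_power(3)
print(square(5), cube(2))  # 25 8
```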

f. Lambda functions are always nameless.

Answer

True
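Even when a lambda is bound to a variable, the function object itself stays anonymous: its __name__ attribute is always '<lambda>', unlike a function created with def.

```python
double = lambda x: 2 * x   # bound to a variable, yet still nameless

print(double.__name__)     # <lambda>

def double_def(x):         # a def statement, by contrast, names the function
    return 2 * x

print(double_def.__name__) # double_def
```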

Thursday, 25 January 2024

DevOps on AWS: Release and Deploy

 


Build your subject-matter expertise

This course is part of the DevOps on AWS Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: DevOps on AWS: Release and Deploy

There are 2 modules in this course

AWS provides a set of flexible services designed to enable companies to more rapidly and reliably build and deliver products using AWS and DevOps practices. These services simplify provisioning and managing infrastructure, deploying application code, automating software release processes, and monitoring your application and infrastructure performance. 

This course explains how to improve the deployment process with DevOps methodology, along with some tools that might make deployments easier, such as infrastructure as code (IaC) and AWS CodeDeploy.

The course begins with reviewing topics covered in the first course of the DevOps on AWS series. You will learn about the differences between continuous integration, continuous delivery, and continuous deployment. In Exercises 1 and 2, you will set up AWS CodeDeploy and make revisions that will then be deployed. If you use AWS Lambda, you will explore ways to address additional considerations when you deploy updates to your Lambda functions.

Next, you will explore how infrastructure as code (IaC) helps organizations achieve automation, and which AWS solutions provide a DevOps-focused way of creating and maintaining infrastructure. In Exercise 3, you will be provided with an AWS CloudFormation template that will set up backend services, such as AWS CodePipeline, AWS CodeCommit, AWS CodeDeploy, and AWS CodeBuild. You will then upload new revisions to the pipeline.

DevOps on AWS: Operate and Monitor

 


Build your subject-matter expertise

This course is part of the DevOps on AWS Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: DevOps on AWS: Operate and Monitor

There are 2 modules in this course

The third and final course in the DevOps series teaches you how to use AWS services to bring your architecture to a better operational state. Monitoring and operations are key aspects of both the release pipeline and production environments, because they provide instruments that help you discover what is happening, as well as make modifications and enhancements to infrastructure that is currently running. 

This course teaches how to use Amazon CloudWatch for monitoring, as well as Amazon EventBridge and AWS Config for continuous compliance. It also covers Amazon CloudTrail and a little bit of Machine Learning for Monitoring operations!

Exam Prep: AWS Certified Cloud Practitioner Foundations

 


What you'll learn

The four domains - Cloud Concepts, Security and Compliance, Technology, and Billing and Pricing - for the AWS Certified Cloud Practitioner exam

Certification exam-level practice questions written by experts from AWS

Simulations designed to solidify understanding of cloud concepts you need to know for the exam

Join Free: Exam Prep: AWS Certified Cloud Practitioner Foundations

There are 4 modules in this course

This new foundational-level course from Amazon Web Services (AWS) is designed to help you assess your preparedness for the AWS Certified Cloud Practitioner certification exam. You will learn how to prepare for the exam by exploring the exam’s topic areas and how they map to both AWS Cloud practitioner roles and to specific areas of study. You will review sample certification questions in each domain, practice skills with hands-on exercises, test your knowledge with practice question sets, and learn strategies for identifying incorrect responses by interpreting the concepts being tested. At the end of this course, you will have the knowledge and tools to help you identify your strengths and weaknesses in each of the domain areas tested on the certification exam. 

The AWS Certified Cloud Practitioner (CLF-C01) exam is intended for individuals who can effectively demonstrate an overall knowledge of the AWS Cloud, independent of a specific job role. The exam validates a candidate’s ability to complete the following tasks: explain the value of the AWS Cloud; understand and explain the AWS shared responsibility model; understand security best practices; understand AWS Cloud costs, economics, and billing practices; describe and position the core AWS services, including compute, network, databases, and storage; and identify AWS services for common use cases.

AWS Cloud Practitioner Essentials

 


What you'll learn

Understand the working definition of the AWS Cloud

Differentiate between on-premises, hybrid-cloud, and all-in cloud

Describe the basic global infrastructure of the AWS Cloud

Explain the benefits of the AWS Cloud

Join Free: AWS Cloud Practitioner Essentials

There are 7 modules in this course

Welcome to AWS Cloud Practitioner Essentials. If you’re new to the cloud, whether you’re in a technical role or a non-technical role such as finance, legal, sales, or marketing, this course will provide you with an understanding of fundamental AWS Cloud concepts to help you gain the confidence to contribute to your organization’s cloud initiatives. This course is also the starting point to prepare for your AWS Certified Cloud Practitioner certification whenever it’s convenient for you.

After you complete the course, you’ll understand the benefits of the AWS Cloud and the basics of its global infrastructure. You’ll be able to describe and provide an example of the core AWS services, including compute, network, databases, and storage. For the finance-minded, you’ll be able to articulate the financial benefits of the AWS Cloud, define core billing and pricing models, and learn how to use pricing tools to make cost-effective choices for AWS services.

Migrating to the AWS Cloud

 


Build your subject-matter expertise

This course is part of the AWS Fundamentals Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: Migrating to the AWS Cloud

There are 4 modules in this course

This introductory course is for anyone who wants a deeper dive into AWS migration. Whether you want to understand what services are helpful, need to plan a migration for your organization, or are helping other groups with their own migration, you will find valuable information throughout this course. The course sessions structure cloud migration through the three-phase migration process from AWS: assess, mobilize, and migrate and modernize. This process is designed to help your organization approach and implement a migration of tens, hundreds, or thousands of applications. By learning about this three-phase structure—and the various AWS tools, features, and services that can help you during each phase—you will complete this course with a better understanding of how to design and implement migrations to AWS.

Architecting Solutions on AWS

 


Build your subject-matter expertise

This course is available as part of 

When you enroll in this course, you'll also be asked to select a specific program.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: Architecting Solutions on AWS

There are 4 modules in this course

Are you looking to get more technical? Are you looking to begin working in the cloud, but don’t know where to go next? Are you looking to up your game by prepping for the AWS Solutions Architect Associate exam? Do you see yourself as a cloud consultant, but can’t quite envision how your days would be? Are you puzzled about how to match a customer’s requirements with the right AWS services and solutions? If so, you are in the right place! You’ll learn how to plan, think, and act like a solutions architect in a real-life customer scenario.

In this course, you’ll get prepared to begin your career architecting solutions on AWS. Through a series of use case scenarios and practical learning, you’ll learn to identify services and features to build resilient, secure, and highly available IT solutions in the AWS Cloud. Each week, a fictional customer will present a different need. We will then review the options, choose the best one for the use case and walk you through the architecture design on a whiteboard. You’ll learn about event-driven architectures with a focus on performance efficiency and cost. You’ll then gain knowledge on how to architect a solution using many purpose-built AWS services. With this understanding, you’ll get a sense of hybrid architectures with a refined focus on reliability and operational efficiency. Finally, you’ll wrap up your learning by understanding a multi-account strategy centered on security and cost.

AWS Cloud Solutions Architect Professional Certificate

 


What you'll learn

Make informed decisions about when and how to apply key AWS Services for compute, storage, database, networking, monitoring, and security.

Design architectural solutions, whether designing for cost, performance, and/or operational excellence, to address common business challenges.

Create and operate a data lake in a secure and scalable way, ingest and organize data into the data lake, and optimize performance and costs.

Prepare for the certification exam, identify your strengths and gaps for each domain area, and build strategies for identifying incorrect responses.

Join Free: AWS Cloud Solutions Architect Professional Certificate

Professional Certificate - 4 course series

This professional certificate provides the knowledge and skills you need to start building your career in cloud architecture and helps you prepare for the AWS Certified Solutions Architect - Associate exam. You will start by learning key AWS Services for compute, storage, database, networking, monitoring, and security, then dive into how to design architectural solutions, how to create and operate a data lake, and how to prepare for the certification exam.

The AWS Certified Solutions Architect – Associate certification showcases knowledge and skills in AWS technology across a wide range of AWS services. The certification focuses on the design of cost- and performance-optimized solutions and demonstrating a strong understanding of the AWS Well-Architected Framework. This AWS Certification is one of the top-paying IT certifications, per the SkillSoft IT Skills and Salary report. Per Enterprise Strategy Group, surveyed AWS Certification holders credited their certification for their higher earnings (74%), increased confidence (87%), and increased influence among coworkers (79%).

To prepare for your AWS Certification exam, we recommend that, in addition to attaining this professional certificate, candidates review the free exam guide, sample questions, and AWS technical documentation (e.g., white papers and product FAQs) on the AWS Certified Solutions Architect - Associate exam page to understand what content and services are covered by the exam.

Applied Learning Project

Through 15 hands-on labs, you’ll use the AWS Management Console to apply skills learned in the videos. 

For example: 

In Architecting Solutions on AWS, you’ll use Amazon API Gateway, AWS Lambda, Amazon SQS, Amazon DynamoDB, and Amazon SNS to build a serverless web backend.

In Introduction to Designing Data Lakes, you’ll use Amazon S3, Amazon OpenSearch Service, AWS Lambda and Amazon API Gateway to create an Amazon OpenSearch Service Cluster. You’ll also use Amazon S3, Amazon EC2, Amazon Kinesis Data Firehose, Amazon Kinesis Data Analytics, Amazon Elasticsearch Service to create a data ingestion pipeline with the use of high-scale AWS Managed services. 

In Cloud Technical Essentials, you’ll design a 3-tier architecture using services like Amazon VPC, Amazon EC2, Amazon RDS with high availability and Elastic Load Balancing following AWS best practices. You’ll upload an architecture diagram laying out your design including the networking layer.

AWS Cloud Technical Essentials

 


What you'll learn

Describe terminology and concepts related to AWS services     

Articulate key concepts of AWS security measures and AWS Identity and Access Management (IAM)    

Distinguish among several AWS compute services, including Amazon EC2, AWS Lambda, and Amazon ECS

Understand AWS database and storage offerings, including Amazon Relational Database Service (Amazon RDS), Amazon DynamoDB, and Amazon S3

Join Free: AWS Cloud Technical Essentials

There are 4 modules in this course

Are you in a technical role and want to learn the fundamentals of AWS? Do you aspire to have a job or career as a cloud developer, architect, or in an operations role? If so, AWS Cloud Technical Essentials is an ideal way to start. This course was designed for those at the beginning of their cloud-learning journey - no prior knowledge of cloud computing or AWS products and services required!

Throughout the course, students will build a highly available, scalable, and cost-effective application step by step. Upon course completion, you will be able to make an informed decision about when and how to apply core AWS services for compute, storage, and database to different use cases. You’ll also learn about cloud security with a review of the AWS shared responsibility model and an introduction to AWS Identity and Access Management (IAM). And you’ll know how AWS services can be used to monitor and optimize infrastructure in the cloud.

AWS Cloud Technical Essentials is a fundamental-level course and will build your competence, confidence, and credibility with practical cloud skills that help you innovate and advance your professional future. Enroll in AWS Cloud Technical Essentials and start learning the technical fundamentals of AWS today!

Note: This course was designed for learners with a technical background. If you are new to the cloud or come from a business background, we recommend completing AWS Cloud Practitioner Essentials (https://www.coursera.org/learn/aws-cloud-practitioner-essentials) before enrolling in this course.

Serverless Architectures on AWS

 


Build your subject-matter expertise

This course is part of the Developing Applications on AWS Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: Serverless Architectures on AWS

There are 2 modules in this course

A modern software engineer knows how to use the benefits of managed services from Amazon Web Services to reduce the coding needed to get a project across the line. There’s a lot of code you really don’t need to write when you can use a managed service for your applications. Less code means fewer tests, fewer bugs, and quicker delivery. 

In this course, we get hands on with automation tools and serverless managed services. Get your projects completed faster without needing to maintain the underlying servers hosting the managed services. Treat your infrastructure as code using AWS CloudFormation and AWS Serverless Application Model as an automated way to build the resources hosting your applications. We use AWS Amplify to rapidly add front-end hosting and AWS Cognito to add authentication to our application. With Cognito in place, we upgrade the application API to require authentication. Next, we learn to use AWS Step Functions to move a lot of the workflow coordination code out of your applications. Using serverless services, we contrast some options for building event driven architectures with Amazon SNS, Amazon SQS and Amazon EventBridge. Join our expert instructors as we dive deep on real-world use cases for each of the featured services in the course. 

This course will provide a combination of video-based lectures, demonstrations and hands-on lab exercises that will get you working with automation tools, Cognito authentication, Step Function workflows and event-driven architectures.

AWS Fundamentals Specialization

 


Advance your subject-matter expertise

Learn in-demand skills from university and industry experts

Master a subject or tool with hands-on projects

Develop a deep understanding of key concepts

Earn a career certificate from Amazon Web Services

Join Free: AWS Fundamentals Specialization

Specialization - 3 course series

This specialization gives current or aspiring IT professionals an overview of the features, benefits, and capabilities of Amazon Web Services (AWS). As you proceed through these interconnected courses, you will gain a more vivid understanding of core AWS services, key AWS security concepts, strategies for migrating from on-premises to AWS, and the basics of building serverless applications with AWS. Additionally, you will have opportunities to practice what you have learned by completing labs and exercises developed by AWS technical instructors.


Introduction to Machine Learning on AWS

 


What you'll learn

Differentiate between artificial intelligence (AI), machine learning, and deep learning. 

Select the appropriate AWS machine learning service for a given use case.

Discover how to build, train, and deploy machine learning models.

Join Free: Introduction to Machine Learning on AWS

There are 2 modules in this course

In this course, we start with some services where the training model and raw inference is handled for you by Amazon. We'll cover services which do the heavy lifting of computer vision, data extraction and analysis, language processing, speech recognition, translation, ML model training and virtual agents. You'll think of your current solutions and see where you can improve these solutions using AI, ML or Deep Learning. All of these solutions can work with your current applications to make some improvements in your user experience or the business needs of your application.

Learn SQL Basics for Data Science Specialization

 


What you'll learn

Use SQL commands to filter, sort, & summarize data; manipulate strings, dates, & numerical data from different sources for analysis

Assess and create datasets to solve your business questions and problems using SQL

Use the collaborative Databricks workspace and create an end-to-end pipeline that reads data, transforms it, and saves the result

Develop a project proposal & select your data, perform statistical analysis & develop metrics, and present your findings & make recommendations

Join Free: Learn SQL Basics for Data Science Specialization

Specialization - 4 course series

This Specialization is intended for a learner with no previous coding experience seeking to develop SQL query fluency. Through four progressively more difficult SQL projects with data science applications, you will cover topics such as SQL basics, data wrangling, SQL analysis, A/B testing, distributed computing using Apache Spark, Delta Lake, and more. These topics will prepare you to apply SQL creatively to analyze and explore data; demonstrate efficiency in writing queries; create data analysis datasets; conduct feature engineering; use SQL with other data analysis and machine learning toolsets; and use SQL with unstructured data sets. 
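As a taste of the query patterns covered - filtering, summarizing, and sorting - here is a minimal sketch using Python's built-in sqlite3 module; the sales table and its rows are made up for illustration:

```python
import sqlite3

# Build a tiny in-memory table, purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("west", 250), ("east", 50), ("west", 75)])

rows = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    WHERE amount > 60          -- filter rows
    GROUP BY region            -- summarize per group
    ORDER BY total DESC        -- sort the result
""").fetchall()
print(rows)  # [('west', 325.0), ('east', 100.0)]
```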


Data Visualization and Dashboards with Excel and Cognos

 


What you'll learn

Create basic visualizations such as line graphs, bar graphs, and pie charts using Excel spreadsheets.

Explain the important role charts play in telling a data-driven story. 

Construct advanced charts and visualizations such as Treemaps, Sparklines, Histogram, Scatter Plots, and Filled Map Charts.

Build and share interactive dashboards using Excel and Cognos Analytics.

Join Free: Data Visualization and Dashboards with Excel and Cognos

There are 4 modules in this course

Learn how to create data visualizations and dashboards using spreadsheets and analytics tools. This course covers some of the first steps for telling a compelling story with your data using various types of charts and graphs. You'll learn the basics of visualizing data with Excel and IBM Cognos Analytics without having to write any code. 

You'll start by creating simple charts in Excel such as line, pie and bar charts. You will then create more advanced visualizations with Treemaps, Scatter Charts, Histograms, Filled Map Charts, and Sparklines. Next you’ll also work with the Excel PivotChart feature as well as assemble several visualizations in an Excel dashboard.  

This course also teaches you how to use business intelligence (BI) tools like Cognos Analytics  to create interactive dashboards. By the end of the course you will have an appreciation for the key role that data visualizations play in communicating your data analysis findings, and the ability to effectively create them. 

Throughout this course there will be numerous hands-on labs to help you develop practical experience for working with Excel and Cognos. There is also a final project in which you’ll create a set of data visualizations and an interactive dashboard to add to your portfolio, which you can share with peers, professional communities or prospective employers.

Mathematics for Machine Learning: Linear Algebra

 


Build your subject-matter expertise

This course is part of the Mathematics for Machine Learning Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: Mathematics for Machine Learning: Linear Algebra

There are 5 modules in this course

In this course on Linear Algebra, we look at what linear algebra is and how it relates to vectors and matrices. Then we look at what vectors and matrices are and how to work with them, including the knotty problem of eigenvalues and eigenvectors, and how to use these to solve problems. Finally, we look at how to use these to do fun things with datasets - like how to rotate images of faces and how to extract eigenvectors to look at how the PageRank algorithm works.

Since we're aiming at data-driven applications, we'll be implementing some of these ideas in code, not just on pencil and paper. Towards the end of the course, you'll write code blocks and encounter Jupyter notebooks in Python, but don't worry, these will be quite short, focussed on the concepts, and will guide you through if you’ve not coded before.

At the end of this course you will have an intuitive understanding of vectors and matrices that will help you bridge the gap into linear algebra problems, and how to apply these concepts to machine learning.
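The eigenvector idea behind PageRank can be sketched in a few lines of Python: power iteration repeatedly applies a matrix to a vector, which converges to the dominant eigenvector. The three-page link matrix below is invented for illustration.

```python
# M is a hypothetical column-stochastic link matrix for three pages:
# column j holds the probabilities of following a link out of page j.
M = [[0.0, 1.0, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.0, 0.0]]

rank = [1 / 3] * 3                      # start from a uniform guess
for _ in range(100):
    # apply the matrix: rank <- M @ rank
    rank = [sum(M[i][j] * rank[j] for j in range(3)) for i in range(3)]
    total = sum(rank)
    rank = [r / total for r in rank]    # renormalize each step

print([round(r, 3) for r in rank])      # [0.444, 0.333, 0.222]
```

The final vector is the dominant eigenvector of M: the pages' PageRank scores.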

Introduction to Probability and Data with R

 


Build your subject-matter expertise

This course is part of the Data Analysis with R Specialization

When you enroll in this course, you'll also be enrolled in this Specialization.

Learn new concepts from industry experts

Gain a foundational understanding of a subject or tool

Develop job-relevant skills with hands-on projects

Earn a shareable career certificate

Join Free: Introduction to Probability and Data with R

There are 8 modules in this course

This course introduces you to sampling and exploring data, as well as basic probability theory and Bayes' rule. You will examine various types of sampling methods, and discuss how such methods can impact the scope of inference. A variety of exploratory data analysis techniques will be covered, including numeric summary statistics and basic data visualization. You will be guided through installing and using R and RStudio (free statistical software), and will use this software for lab exercises and a final project. The concepts and techniques in this course will serve as building blocks for the inference and modeling courses in the Specialization.
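Bayes' rule, one of the topics above, can be illustrated with a short calculation (shown in Python here for consistency with the other examples in this post, and with invented numbers, not material from the course): a 95%-sensitive, 90%-specific test for a condition with 1% prevalence.

```python
p_condition = 0.01                 # prior: prevalence of the condition
p_pos_given_condition = 0.95       # sensitivity
p_pos_given_healthy = 0.10         # false-positive rate (1 - specificity)

# Total probability of testing positive (law of total probability):
p_pos = (p_pos_given_condition * p_condition
         + p_pos_given_healthy * (1 - p_condition))

# Bayes' rule: P(condition | positive) =
#     P(positive | condition) * P(condition) / P(positive)
p_condition_given_pos = p_pos_given_condition * p_condition / p_pos
print(round(p_condition_given_pos, 3))  # 0.088
```

Even with a positive result, the probability of having the condition is under 9%, a classic illustration of why the prior matters.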

Extract, Transform and Load Data in Power BI

 


What you'll learn

How to set up a data source and explain and configure storage modes in Power BI.

How to prepare for data modeling by cleaning and transforming data.

How to use profiling tools to identify data anomalies.

How to reference queries and dataflows and use the Advanced Editor to modify code. 

Join Free: Extract, Transform and Load Data in Power BI

There are 4 modules in this course

This course forms part of the Microsoft Power BI Analyst Professional Certificate. This Professional Certificate consists of a series of courses that offers a good starting point for a career in data analysis using Microsoft Power BI.

In this course, you will learn the process of Extract, Transform and Load or ETL. You will identify how to collect data from and configure multiple sources in Power BI and prepare and clean data using Power Query. You’ll also have the opportunity to inspect and analyze ingested data to ensure data integrity. 

After completing this course, you’ll be able to: 

Identify, explain and configure multiple data sources in Power BI  
Clean and transform data using Power Query  
Inspect and analyze ingested data to ensure data integrity

This is also a great way to prepare for the Microsoft PL-300 exam. By passing the PL-300 exam, you’ll earn the Microsoft Power BI Data Analyst certification.

Data Visualization with Tableau Specialization

 


What you'll learn

Examine, navigate, and learn to use the various features of Tableau

Assess the quality of the data and perform exploratory analysis

Create and design visualizations and dashboards for your intended audience

Combine the data and follow best practices to present your story

Join Free: Data Visualization with Tableau Specialization

Specialization - 5 course series

In 2020, the world will generate 50 times the amount of data and 75 times the number of information sources as in 2011 (IDC, 2011). Being able to use this data provides huge opportunities, and to turn these opportunities into reality, people need to use data to solve problems.

 This Specialization, in collaboration with Tableau, is intended for newcomers to data visualization with no prior experience using Tableau. We leverage Tableau's library of resources to demonstrate best practices for data visualization and data storytelling. You will view examples from real world business cases and journalistic examples from leading media companies. 

By the end of this specialization, you will be able to generate powerful reports and dashboards that will help people make decisions and take action based on their business data. You will use Tableau to create high-impact visualizations of common data analyses to help you see and understand your data. You will apply predictive analytics to improve business decision making. The Specialization culminates in a Capstone Project in which you will use sample data to create visualizations, dashboards, and data models to prepare a presentation to the executive leadership of a fictional company.

Microsoft Power BI Data Analyst Professional Certificate


 


What you'll learn

Learn to use Power BI to connect to data sources and transform them into meaningful insights.  

Prepare Excel data for analysis in Power BI using the most common formulas and functions in a worksheet.     

Learn to use the visualization and report capabilities of Power BI to create compelling reports and dashboards.  

Demonstrate your new skills with a capstone project and prepare for the industry-recognized Microsoft PL-300 Certification exam.  

Join Free: Microsoft Power BI Data Analyst Professional Certificate

Professional Certificate - 8 course series

Learners who complete this program will receive a 50% discount voucher to take the PL-300 Certification Exam. 

Business Intelligence analysts are highly sought after as more organizations rely on data-driven decision-making. Microsoft Power BI is the leading data analytics, business intelligence, and reporting tool in the field, used by 97% of Fortune 500 companies to make decisions based on data-driven insights and analytics.1 Prepare for a new career in this high-growth field with professional training from Microsoft — an industry-recognized leader in data analytics and business intelligence.

Through a mix of videos, assessments, and hands-on activities, you will engage with the key concepts of Power BI, transforming data into meaningful insights and creating compelling reports and dashboards. You will learn to prepare data in Excel for analysis in Power BI, form data models using the Star schema, perform calculations in DAX, and more.

In your final project, you will showcase your new Power BI and data analysis skills using a real-world scenario. When you complete this Professional Certificate, you’ll have tangible examples to talk about in your job interviews and you’ll also be prepared to take the industry-recognized PL-300: Microsoft Power BI Data Analyst certification exam.


1Microsoft named a Leader in the 2023 Gartner® Magic Quadrant™ for Analytics and BI Platforms (April 2023)

Applied Learning Project

This program has been uniquely mapped to key job skills required in a Power BI data analyst role. In each course, you’ll be able to consolidate what you have learned by completing a project that simulates a real-world data analysis scenario using Power BI. You’ll also complete a final capstone project where you’ll showcase all your new Power BI data analytical skills.

The projects will include:

● A real-world scenario where you connect to data sources and transform data into an optimized data model for data analysis. 

● A real-world scenario where you demonstrate data storytelling through dashboards, reports and charts to solve business challenges and identify new opportunities.

● A real-world capstone project where you analyze the performance of a multinational business and prepare executive dashboards and reports.

To round off your learning, you’ll take a mock exam that has been set up in a similar style to the industry-recognized Exam PL-300: Microsoft Power BI Data Analyst.

Data Analysis with R Programming

 


What you'll learn

Describe the R programming language and its programming environment.

Explain the fundamental concepts associated with programming in R including functions, variables, data types, pipes, and vectors.

Describe the options for generating visualizations in R.

Demonstrate an understanding of the basic formatting in R Markdown to create structure and emphasize content.

Join Free: Data Analysis with R Programming

There are 5 modules in this course

This course is the seventh course in the Google Data Analytics Certificate. In this course, you’ll learn about the programming language known as R. You’ll find out how to use RStudio, the environment that allows you to work with R, and the software applications and tools that are unique to R, such as R packages. You’ll discover how R lets you clean, organize, analyze, visualize, and report data in new and more powerful ways. Current Google data analysts will continue to instruct and provide you with hands-on ways to accomplish common data analyst tasks with the best tools and resources.

Learners who complete this certificate program will be equipped to apply for introductory-level jobs as data analysts. No previous experience is necessary.

By the end of this course, you will:

- Examine the benefits of using the R programming language.
- Discover how to use RStudio to apply R to your analysis. 
- Explore the fundamental concepts associated with programming in R. 
- Understand the contents and components of R packages including the Tidyverse package.
- Gain an understanding of dataframes and their use in R.
- Discover the options for generating visualizations in R.
- Learn about R Markdown for documenting R programming.

IBM Data Science Professional Certificate

 


What you'll learn

Master the most up-to-date practical skills and knowledge that data scientists use in their daily roles

Learn the tools, languages, and libraries used by professional data scientists, including Python and SQL

Import and clean data sets, analyze and visualize data, and build machine learning models and pipelines

Apply your new skills to real-world projects and build a portfolio of data projects that showcase your proficiency to employers

Join Free: IBM Data Science Professional Certificate

Professional Certificate - 10 course series

Prepare for a career in the high-growth field of data science. In this program, you’ll develop the skills, tools, and portfolio to have a competitive edge in the job market as an entry-level data scientist in as little as 5 months. No prior knowledge of computer science or programming languages is required. 

Data science involves gathering, cleaning, organizing, and analyzing data with the goal of extracting helpful insights and predicting expected outcomes. The demand for skilled data scientists who can use data to tell compelling stories to inform business decisions has never been greater. 

You’ll learn in-demand skills used by professional data scientists including databases, data visualization, statistical analysis, predictive modeling, machine learning algorithms, and data mining. You’ll also work with the latest languages, tools, and libraries, including Python, SQL, Jupyter Notebooks, GitHub, RStudio, pandas, NumPy, scikit-learn, Matplotlib, and more.

Upon completing the full program, you will have built a portfolio of data science projects to provide you with the confidence to excel in your interviews. You will also receive access to join IBM’s Talent Network where you’ll see job opportunities as soon as they are posted, recommendations matched to your skills and interests, and tips and tricks to help you stand apart from the crowd. 

This program is ACE® and FIBAA recommended — when you complete it, you can earn up to 12 college credits and 6 ECTS credits.

Applied Learning Project

This Professional Certificate has a strong emphasis on applied learning and includes a series of hands-on labs in the IBM Cloud that give you practical skills with applicability to real jobs.

Tools you’ll use: Jupyter / JupyterLab, GitHub, R Studio, and Watson Studio

Libraries you’ll use: Pandas, NumPy, Matplotlib, Seaborn, Folium, ipython-sql, Scikit-learn, SciPy, etc.

Projects you’ll complete:

Extract and graph financial data with the Pandas Python library

Use SQL to query census, crime, and school demographic data sets

Wrangle data, graph plots, and create regression models to predict housing prices with data science Python libraries

Create a dynamic Python dashboard to monitor, report, and improve US domestic flight reliability

Apply and compare machine learning classification algorithms to predict whether a loan case will be paid off or not

Train and compare machine learning models to predict if a space launch can reuse the first stage of a rocket

Wednesday, 24 January 2024

Indian Flag using NumPy and Matplotlib in Python

 


Code : 

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.patches as patches
def draw_tricolor_flag():
    # Create figure and axes
    fig, ax = plt.subplots()
    # Draw tricolor bands
    colors = ['#138808', '#ffffff', '#FF6103']
    for i, color in enumerate(colors):
        rect = patches.Rectangle((0, 2*i+1), width=9, height=2, facecolor=color, edgecolor='grey')
        ax.add_patch(rect)
    # Draw Ashoka Chakra circle
    chakra_radius = 0.8
    ax.plot(4.5, 4, marker='o', markerfacecolor='#000080', markersize=9.5)
    chakra = patches.Circle((4.5, 4), chakra_radius, color='#000080', fill=False, linewidth=7)
    ax.add_artist(chakra)

    # Draw 24 spokes in Ashoka Chakra
    for i in range(24):
        angle1 = np.pi * i / 12 - np.pi / 48
        angle2 = np.pi * i / 12 + np.pi / 48
        spoke = patches.Polygon([[4.5, 4],
                                 [4.5 + chakra_radius / 2 * np.cos(angle1),
                                  4 + chakra_radius / 2 * np.sin(angle1)],
                                 [4.5 + chakra_radius * np.cos(np.pi * i / 12),
                                  4 + chakra_radius * np.sin(np.pi * i / 12)],
                                 [4.5 + chakra_radius / 2 * np.cos(angle2),
                                  4 + chakra_radius / 2 * np.sin(angle2)]],
                                fill=True, closed=True, color='#000080')
        ax.add_patch(spoke)
    # Set equal axis and display the plot
    ax.axis('equal')
    plt.show()
# Call the function to draw the tricolor flag
draw_tricolor_flag()
#clcoding.com
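Note that plt.show() opens an interactive window, which won't work in a headless environment such as a server or CI job. A minimal sketch of the same drawing pattern that writes the figure to a file instead — the Agg backend and the flag.png filename are choices made for this example, not part of the original script:

```python
import matplotlib
matplotlib.use("Agg")              # headless backend: render without a display
import matplotlib.pyplot as plt
import matplotlib.patches as patches

fig, ax = plt.subplots()
# Same tricolor bands as above, bottom to top: green, white, saffron
for i, color in enumerate(['#138808', '#ffffff', '#FF6103']):
    ax.add_patch(patches.Rectangle((0, 2*i + 1), width=9, height=2,
                                   facecolor=color, edgecolor='grey'))
ax.axis('equal')
fig.savefig("flag.png", dpi=150)   # write the figure to disk instead of showing it
```

The backend must be selected before matplotlib.pyplot is imported, which is why matplotlib.use("Agg") comes first.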

Explanation:


Imports:

- numpy for numerical operations.
- matplotlib.pyplot for creating plots.
- matplotlib.patches for creating shapes like rectangles and polygons.

Function Definition (draw_tricolor_flag):

- Creates a figure and axes for the plot.

Drawing Tricolor Bands:

- Three rectangles are drawn to represent the tricolor bands of the flag using the colors green ('#138808'), white ('#ffffff'), and saffron ('#FF6103').

Drawing Ashoka Chakra Circle:

- A circle is drawn at the center of the plot representing the Ashoka Chakra. It is outlined in blue ('#000080').

Drawing 24 Spokes in Ashoka Chakra:

- A loop calculates the coordinates for each spoke and uses patches.Polygon to draw each spoke. The spokes are drawn in blue ('#000080').

Setting Equal Axis and Displaying the Plot:

- The axis is set to be equal, ensuring an equal aspect ratio, and the plot is displayed.
- The comment #clcoding.com at the end of the code identifies the source website and doesn't affect the code's functionality.


Code Explanation 

Imports:
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.patches as patches
numpy is imported as np for numerical operations.
matplotlib.pyplot is imported as plt for creating plots.
matplotlib.patches is imported as patches for creating shapes like rectangles and polygons.

Function Definition:
def draw_tricolor_flag():
This defines a function named draw_tricolor_flag.

Creating Figure and Axes:
fig, ax = plt.subplots()
This creates a figure and axes for the plot.

Drawing Tricolor Bands:
colors = ['#138808', '#ffffff', '#FF6103']
for i, color in enumerate(colors):
    rect = patches.Rectangle((0, 2*i+1), width=9, height=2, facecolor=color, edgecolor='grey')
    ax.add_patch(rect)
It defines three colors for the tricolor bands and iterates through them, drawing rectangles for each color.

Drawing Ashoka Chakra Circle:
chakra_radius = 0.8
ax.plot(4.5, 4, marker='o', markerfacecolor='#000080', markersize=9.5)
chakra = patches.Circle((4.5, 4), chakra_radius, color='#000080', fill=False, linewidth=7)
ax.add_artist(chakra)
It draws a circle at the center of the plot representing the Ashoka Chakra.

Drawing 24 Spokes in Ashoka Chakra:
for i in range(24):
    angle1 = np.pi * i / 12 - np.pi / 48
    angle2 = np.pi * i / 12 + np.pi / 48
    spoke = patches.Polygon([[4.5, 4], 
                             [4.5 + chakra_radius / 2 * np.cos(angle1), 
                              4 + chakra_radius / 2 * np.sin(angle1)], 
                             [4.5 + chakra_radius * np.cos(np.pi * i / 12), 
                              4 + chakra_radius * np.sin(np.pi * i / 12)], 
                             [4.5 + chakra_radius / 2 * np.cos(angle2), 
                              4 + chakra_radius / 2 * np.sin(angle2)]], 
                            fill=True, closed=True, color='#000080')
    ax.add_patch(spoke)
It uses a loop to draw 24 spokes in the Ashoka Chakra.

Setting Equal Axis and Displaying the Plot:
ax.axis('equal')
plt.show()
It ensures that the aspect ratio of the plot is equal and then displays the plot.
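The spoke construction above can be sanity-checked without plotting: each spoke i is centred at angle 2πi/24 (which is πi/12 for 24 spokes), with its two base corners offset by ±π/48. A standard-library sketch confirming that the centre angles evenly partition the circle:

```python
import math

def spoke_angles(n_spokes=24, half_width=math.pi / 48):
    """Return (angle1, centre, angle2) for each spoke, mirroring the drawing loop."""
    triples = []
    for i in range(n_spokes):
        centre = 2 * math.pi * i / n_spokes   # equals pi*i/12 when n_spokes == 24
        triples.append((centre - half_width, centre, centre + half_width))
    return triples

triples = spoke_angles()
print(len(triples))                                   # 24
print(round(triples[1][1] - triples[0][1], 9)
      == round(math.pi / 12, 9))                      # True: spokes are pi/12 apart
```

Parameterising the count this way also makes it easy to experiment with a different number of spokes while keeping them evenly spaced.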

How much do you know about Python Modules and packages?

a. A function can belong to a module and the module can belong to a package.

Answer

True

b. A package can contain one or more modules in it.

Answer

True

c. Nested packages are allowed.

Answer

True

d. Contents of sys.path variable cannot be modified.

Answer

False

e. In the statement import a.b.c, c cannot be a function.

Answer

True

f. It is a good idea to use * to import all the functions/classes defined in a module.

Answer

False
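Several of these points can be demonstrated directly. The sketch below builds a tiny nested package on disk (the names pkg, sub, and mod are invented for the demo), appends its location to sys.path — which is an ordinary, mutable list — and imports a function through the dotted path:

```python
import os
import sys
import tempfile

# Layout created at runtime:
#   pkg/              <- a package can contain modules and other packages (b, c)
#     __init__.py
#     sub/
#       __init__.py
#       mod.py        <- a function belongs to this module (a)
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "pkg", "sub"))
for d in ("pkg", os.path.join("pkg", "sub")):
    open(os.path.join(root, d, "__init__.py"), "w").close()
with open(os.path.join(root, "pkg", "sub", "mod.py"), "w") as f:
    f.write("def greet():\n    return 'hello'\n")

sys.path.append(root)        # (d) sys.path CAN be modified
import pkg.sub.mod           # (e) every dotted name in `import a.b.c` must be a module/package
print(pkg.sub.mod.greet())   # hello
```

`import pkg.sub.mod.greet` would raise ModuleNotFoundError, since the last name in an import statement must be a module or package, not a function. And while `from pkg.sub.mod import *` works, PEP 8 discourages wildcard imports because they obscure which names are present in the namespace.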

Microsoft Azure Fundamentals AZ-900 Exam Prep Specialization

 


Advance your subject-matter expertise

Learn in-demand skills from university and industry experts

Master a subject or tool with hands-on projects

Develop a deep understanding of key concepts

Earn a career certificate from Microsoft

Join Free: Microsoft Azure Fundamentals AZ-900 Exam Prep Specialization

Specialization - 4 course series

Cloud computing is rapidly expanding into all areas of businesses, creating new and exciting career opportunities. These opportunities cover a broad range of roles, from developers and architects to security professionals and data scientists. This program will give you the fundamental knowledge, skills, and confidence to begin your Microsoft Azure certification journey.

This Microsoft Azure Fundamentals AZ-900 Exam Prep Specialization consists of four courses that will act as a bedrock of fundamental knowledge to prepare you for the AZ-900 certification exam and for a career in the cloud. The content of this program is tightly aligned to the AZ-900 exam objective domains.

This program will provide foundational level knowledge on Microsoft Azure concepts; core Microsoft Azure services; core solutions and management tools; general security and network security; governance, privacy, and compliance features; Microsoft Azure cost management, and service level agreements. Ideal for IT personnel just beginning to work with Microsoft Azure or anyone wanting to learn about it.

This Specialization will prepare you to take the AZ-900: Microsoft Azure Fundamentals exam. Upon completion of the Specialization, you will be offered a discount to the Microsoft Azure Fundamentals Certification Exam to be redeemed at Pearson Vue, Microsoft's proctor exam site. Limited discount vouchers are available on first-come-first-serve basis. Coursera and Microsoft may end the offer at any time. 

Applied Learning Project

Learners will engage in interactive exercises throughout this program that offer opportunities to practice and implement what they are learning. They use the Microsoft Learn Sandbox, a free environment that allows learners to explore Microsoft Azure and get hands-on with live Microsoft Azure resources and services.

For example, when they learn about creating a SQL database, they will work in a temporary Azure environment called the Sandbox. The beauty of this is that they work with real technology in a controlled environment, which allows them to apply what they learn at their own pace.

You will need a Microsoft account to sign into the Sandbox. If you don't have one, you can create one for free. The Learn Sandbox allows free, fixed-time access to a cloud subscription with no credit card required. Learners can safely explore, create, and manage resources without the fear of incurring costs or "breaking production".

Azure Data Lake Storage Gen2 and Data Streaming Solution

 


What you'll learn

How to use Azure Data Lake Storage to make processing Big Data analytical solutions more efficient. 

How to set up a stream analytics job to stream data and manage a running job

How to describe the concepts of event processing and streaming data and how this applies to Azure Stream Analytics 

How to use Advanced Threat Protection to proactively monitor your system, and describe the various ways to upload data to Data Lake Storage Gen2

Join Free: Azure Data Lake Storage Gen2 and Data Streaming Solution

There are 4 modules in this course

In this course, you will see how Azure Data Lake Storage can make processing Big Data analytical solutions more efficient and how easy it is to set up. You will also explore how it fits into common architectures, as well as the different methods of uploading the data to the data store. You will examine the myriad of security features that will ensure your data is secure. Learn the concepts of event processing and streaming data and how this applies to Azure Stream Analytics. You will then set up a stream analytics job to stream data, and learn how to manage and monitor a running job.

This course is part of a Specialization intended for data engineers and developers who want to demonstrate their expertise in designing and implementing data solutions that use Microsoft Azure data services, and for anyone interested in preparing for Exam DP-203: Data Engineering on Microsoft Azure (beta). You will take a practice exam that covers key skills measured by the certification exam.

This is the ninth course in a program of 10 courses to help prepare you to take the exam so that you can have expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove knowledge expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions that use Microsoft Azure data services. Each course teaches you the concepts and skills that are measured by the exam. 

By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).

Prepare for DP-203: Data Engineering on Microsoft Azure Exam

 


What you'll learn

How to refresh and test your knowledge of the skills mapped to all the main topics covered in the DP-203 exam.

How to demonstrate proficiency in the skills measured in Exam DP-203: Data Engineering on Microsoft Azure

How to outline the key points covered in the Microsoft Data Engineer Associate Specialization

How to describe best practices for preparing for the Exam DP-203: Data Engineering on Microsoft Azure

Join Free: Prepare for DP-203: Data Engineering on Microsoft Azure Exam

There are 3 modules in this course

Microsoft certifications give you a professional advantage by providing globally recognized and industry-endorsed evidence of mastering skills in digital and cloud businesses. In this course, you will prepare to take the DP-203: Data Engineering on Microsoft Azure certification exam.

You will refresh your knowledge of how to use various Azure data services and languages to store and produce cleansed and enhanced datasets for analysis. You will test your knowledge in a practice exam​ mapped to all the main topics covered in the DP-203 exam, ensuring you’re well prepared for certification success. 

You will also get a more detailed overview of the Microsoft certification program and where you can go next in your career. You’ll also get tips and tricks, testing strategies, useful resources, and information on how to sign up for the DP-203 proctored exam. By the end of this course, you will be ready to sign up for and take the DP-203 exam.

This is the last course in a program of 10 courses to help prepare you to take the exam so that you can have expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove knowledge expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions that use Microsoft Azure data services. Each course teaches you the concepts and skills that are measured by the exam. 

By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).

Prepare for DP-100: Data Science on Microsoft Azure Exam

 


What you'll learn

Outline the key points covered in the Data Science on Microsoft Azure Exam course

Describe best practices for preparing for the Exam DP-100: Designing and Implementing a Data Science Solution on Azure

Demonstrate proficiency in the skills measured in the DP-100: Designing and Implementing a Data Science Solution on Azure

Join Free: Prepare for DP-100: Data Science on Microsoft Azure Exam

There are 6 modules in this course

Microsoft certifications give you a professional advantage by providing globally recognized and industry-endorsed evidence of mastering skills in digital and cloud businesses. In this course, you will prepare to take the DP-100 Azure Data Scientist Associate certification exam.

You will refresh your knowledge of how to plan and create a suitable working environment for data science workloads on Azure, run data experiments, and train predictive models. In addition, you will recap on how to manage, optimize, and deploy machine learning models into production.

You will test your knowledge in a practice exam​ mapped to all the main topics covered in the DP-100 exam, ensuring you’re well prepared for certification success.

You will also get a more detailed overview of the Microsoft certification program and where you can go next in your career. You’ll also get tips and tricks, testing strategies, useful resources, and information on how to sign up for the DP-100 proctored exam. By the end of this course, you will be ready to sign up for and take the DP-100 exam.

This is the fifth course in a five-course program that prepares you to take the DP-100: Designing and Implementing a Data Science Solution on Azure certification exam.

The certification exam is an opportunity to prove knowledge and expertise in operating machine learning solutions at cloud scale using Azure Machine Learning. This Specialization teaches you to leverage your existing knowledge of Python and machine learning to manage data ingestion and preparation, model training and deployment, and machine learning solution monitoring in Microsoft Azure. Each course teaches you the concepts and skills that are measured by the exam.

This Specialization is intended for data scientists with existing knowledge of Python and machine learning frameworks like Scikit-Learn, PyTorch, and TensorFlow, who want to build and operate machine learning solutions in the cloud. It teaches data scientists how to create end-to-end solutions in Microsoft Azure. Students will learn how to manage Azure resources for machine learning; run experiments and train models; deploy and operationalize machine learning solutions, and implement responsible machine learning. They will also learn to use Azure Databricks to explore, prepare, and model data; and integrate Databricks machine learning processes with Azure Machine Learning.

Microsoft Azure Databricks for Data Engineering

 


What you'll learn

How to work with large amounts of data from multiple sources in different raw formats

How to create production workloads on Azure Databricks with Azure Data Factory

How to build and query a Delta Lake 

How to perform data transformations in DataFrames

How to understand the architecture of an Azure Databricks Spark Cluster and Spark Jobs

Join Free: Microsoft Azure Databricks for Data Engineering

There are 9 modules in this course

In this course, you will learn how to harness the power of Apache Spark and powerful clusters running on the Azure Databricks platform to run large data engineering workloads in the cloud.

You will discover the capabilities of Azure Databricks and the Apache Spark notebook for processing huge files. You will come to understand the Azure Databricks platform and identify the types of tasks well-suited for Apache Spark. You will also be introduced to the architecture of an Azure Databricks Spark Cluster and Spark Jobs. You will work with large amounts of data from multiple sources in different raw formats, and you will learn how Azure Databricks supports day-to-day data-handling functions, such as reads, writes, and queries.

This course is part of a Specialization intended for data engineers and developers who want to demonstrate their expertise in designing and implementing data solutions that use Microsoft Azure data services, and for anyone interested in preparing for Exam DP-203: Data Engineering on Microsoft Azure (beta). You will take a practice exam that covers key skills measured by the certification exam.

This is the eighth course in a program of 10 courses to help prepare you to take the exam so that you can have expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove knowledge expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions that use Microsoft Azure data services. Each course teaches you the concepts and skills that are measured by the exam. 

By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).

Data Integration with Microsoft Azure Data Factory

 


What you'll learn

How to create and manage data pipelines in the cloud 

How to integrate data at scale with Azure Synapse Pipeline and Azure Data Factory

Join Free: Data Integration with Microsoft Azure Data Factory

There are 8 modules in this course

In this course, you will learn how to create and manage data pipelines in the cloud using Azure Data Factory.

This course is part of a Specialization intended for Data engineers and developers who want to demonstrate their expertise in designing and implementing data solutions that use Microsoft Azure data services. It is ideal for anyone interested in preparing for the DP-203: Data Engineering on Microsoft Azure exam (beta). 

This is the third course in a program of 10 courses to help prepare you to take the exam so that you can have expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove knowledge expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions that use Microsoft Azure data services. Each course teaches you the concepts and skills that are measured by the exam. 

By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).

Data Storage in Microsoft Azure

 


What you'll learn

You will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for your data.

Design and implement data storage and data security

Design and develop data processing

Monitor and optimize data storage and data processing

Join Free: Data Storage in Microsoft Azure

There are 5 modules in this course

Azure provides a variety of ways to store data: unstructured, archival, relational, and more. In this course, you will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data you want to store in the cloud.

This course is part of a Specialization intended for data engineers and developers who want to demonstrate their expertise in designing and implementing data solutions that use Microsoft Azure data services, and for anyone interested in preparing for Exam DP-203: Data Engineering on Microsoft Azure (beta).

This is the second in a program of 10 courses to help prepare you to take the exam so that you can have expertise in designing and implementing data solutions that use Microsoft Azure data services. The Data Engineering on Microsoft Azure exam is an opportunity to prove knowledge expertise in integrating, transforming, and consolidating data from various structured and unstructured data systems into structures that are suitable for building analytics solutions that use Microsoft Azure data services. Each course teaches you the concepts and skills that are measured by the exam. 

By the end of this Specialization, you will be ready to sign up for and take Exam DP-203: Data Engineering on Microsoft Azure (beta).
