Shell-Edunet Skills4Future AICTE Internship

Internship Type: Virtual
Internship Title: Edunet Foundation | Shell | Artificial Intelligence with Green Technology | 4-weeks Virtual Internship
Internship Description:

Dive into the world of Artificial Intelligence with Green Technology and unlock the door to a future filled with innovation and opportunity!

Join the Shell-Edunet Skills4Future AICTE Internship! This is your chance to immerse yourself in hands-on learning of essential technical skills. The internship is designed to bridge the employability gap by equipping students with technical skills in both Artificial Intelligence (AI) and Green Skills. This certificate-linked program empowers learners to thrive in a rapidly evolving skill ecosystem and build successful careers in the dynamic technology sector by applying AI, together with Green Skills, to society's sustainability goals.

Industry experts will mentor you throughout the internship. You will have the opportunity to develop project prototypes that tackle real-world challenges using your preferred technology track. Working in a student team under your mentor's guidance, you will identify technology-based solutions to real problems. Selected students will also have the chance to showcase their project prototypes at a regional showcase event attended by industry leaders.

About Shell:

Shell is a global energy and petrochemical company operating in over 70 countries, with a workforce of approximately 103,000 employees. The company's goal is to meet current energy demands while fostering sustainability for the future. Leveraging a diverse portfolio and a talented team, the company drives innovation and facilitates a balanced energy transition. Its stakeholders include customers, investors, employees, partners, communities, governments, and regulators. Upholding core values of safety, honesty, integrity, and respect, the company strives to deliver reliable energy solutions while minimizing environmental impact and contributing to social progress.

About Edunet:

Edunet Foundation (EF) was founded in 2015. Edunet promotes youth innovation and tinkering and helps young people prepare for Industry 4.0 jobs. Edunet has a national footprint, having trained 300,000+ students, and works with regulators, state technical universities, engineering colleges, and high schools throughout India to enhance the career prospects of its beneficiaries.

Keywords:

AI, Power BI, ML, Data Analytics, Green Skilling, Python Programming, Artificial Intelligence, Computer Vision, Deep Learning, Generative AI, Dashboard Programming, Microsoft Excel, Sustainability

Locations: Pan India
No. of interns: 3000
Amount of stipend per month: ZERO
Qualification: Engineering – 2nd, 3rd & 4th Year Students, Sciences & Polytechnics - 2nd, 3rd Year Students
Specialization:
Engineering - Computer Science, IT, Electronics and Communication, Electrical Engineering, Mechatronics, Mechanical Engineering, Data Science

Link: https://internship.aicte-india.org

Perks:
  • Personalized mentorship sessions and collaborative group learning.
  • Opportunities to expedite learning through project-based internships.
  • A holistic learning experience provided by industry experts through knowledge-sharing sessions.
  • Showcase your skills by creating prototypes to solve real-world challenges.
  • Earn certifications from AICTE, Edunet and Industry Partners, boosting your confidence and value to potential future employers.
  • Opportunity to present your project prototypes to a panel of industry experts at a regional showcase event.

Terms of Engagement: 4-Weeks (24th February 2025 to 24th March 2025)

Last date to apply: 31st January 2025

Eligibility Criteria:
  • Age: 17+
  • Pursuing a degree in computer science, IT, electronics, mechatronics, or related fields.
  • Students must be able to commit the required hours for the program in addition to regular academics.
  • Students must have basic computer operating and programming skills, as relevant.
  • Any exposure to programming is preferred but not mandatory.
  • Students should have access to a computer/laptop with an internet connection, either their own or through their institution.

Note: Enrolment of students in the 4-week Skills4Future virtual internship is subject to the discretion of the team responsible for operationalization of the internship at Edunet Foundation.

Indicative timelines for the internship:

  • Onset of registration: 06-01-2025
  • Closing of applications for internship registration: 31-01-2025
  • Orientation of internship: 21-02-2025
  • Commencement of internship: 24-02-2025
  • Offer letter disbursement for internees: 26-02-2025
  • End of internship: 24-03-2025
  • Awarding of certificates: 02-04-2025

Data Analytics Projects

  1. Exhaustive Analysis of Indian Agriculture Sector Using Power BI
  2. Sustainable Supply Chain Performance Dashboard in Power BI

Weekly Completion Tasks

Week 1: Importing, Pre-Processing and Data Modelling  

  • Understanding Data Analytics basics  
  • Understanding the data and its application
  • Understanding the Power BI tool
  • Project planning (module identification)
  • Adding the data to Power BI
  • Preparation: categorization of data, data cleaning operations, data wrangling operations, etc. (a pandas sketch of these steps follows this list)
  • Identify relations among the data tables and perform data modelling
  • Business requirements generation
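
In Power BI these preparation steps are carried out interactively in Power Query and the model view; the short pandas sketch below mirrors the same ideas (removing duplicates, handling missing values, categorizing a column, and relating two tables) purely for illustration. The sales and products tables and their columns are hypothetical placeholders, not part of the project dataset.

```python
import pandas as pd

# Hypothetical fact and dimension tables, standing in for data
# you would load into Power BI via Get Data.
sales = pd.DataFrame({
    "ProductID": [1, 2, 2, 3, None],
    "Region":    ["North", "South", "South", "East", "North"],
    "Sales":     [250.0, 310.0, 310.0, None, 120.0],
})
products = pd.DataFrame({
    "ProductID": [1, 2, 3],
    "Category":  ["Seeds", "Fertilizer", "Equipment"],
})

# Data cleaning: drop duplicate rows, remove rows with no product,
# and fill missing sales values (Power Query offers equivalent steps).
sales = sales.drop_duplicates()
sales = sales.dropna(subset=["ProductID"])
sales["ProductID"] = sales["ProductID"].astype(int)
sales["Sales"] = sales["Sales"].fillna(0)

# Categorization: treat Region as a categorical column.
sales["Region"] = sales["Region"].astype("category")

# Data modelling: relate the two tables on ProductID, analogous to
# creating a one-to-many relationship in Power BI's model view.
model = sales.merge(products, on="ProductID", how="left")
print(model)
```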

Week 1:   

  • Basic understanding of Data Analytics concepts
  • Power BI knowledge
  • Importing data and performing data operations such as data cleaning
  • Create relationships between tables
  • Understanding the business / project requirements

 

Submission Details:  

Expected content: Students should create a GitHub repository, upload their Power BI file (.pbix) to it, and share the link on the Week 1 submission page.

File format: GitHub repository link where your project is uploaded

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

 

Week 2: DAX and Dashboard (Visualization)

  • Understanding Data Analysis Expressions (DAX) for the project context
  • DAX functions for the project context
  • Prepare new measures and new columns using DAX functions according to the project requirements (a sketch of the equivalent operations follows this list)
  • Understanding the various charts and their usage
  • Select the appropriate chart for each project requirement
  • Visualize the data as charts
  • Apply filter(s) on the charts, if needed
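
In Power BI, new columns and measures are written in DAX (for instance a calculated column expression, or a measure built with SUM or CALCULATE). Since the internship's programming language is Python, the sketch below shows the equivalent idea in pandas for illustration only: a row-level calculated column and an aggregated, filterable "measure". The orders table and its columns are hypothetical.

```python
import pandas as pd

# Hypothetical table; in Power BI these steps would be written as DAX.
orders = pd.DataFrame({
    "Region": ["North", "North", "South", "East"],
    "Sales":  [250.0, 120.0, 310.0, 90.0],
    "Cost":   [180.0, 100.0, 240.0, 60.0],
})

# New columns (DAX calculated-column analogue): row-by-row expressions.
orders["Margin"] = orders["Sales"] - orders["Cost"]
orders["MarginPct"] = orders["Margin"] / orders["Sales"]

# New measure (DAX measure analogue): an aggregation evaluated over a
# filter context -- here grouped by Region for illustration.
total_sales_by_region = orders.groupby("Region")["Sales"].sum()

# Filtered measure, similar in spirit to CALCULATE(SUM(...), <filter>).
north_sales = orders.loc[orders["Region"] == "North", "Sales"].sum()

print(orders)
print(total_sales_by_region)
print("North sales:", north_sales)
```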

Week 2:  

  • Prepare DAX functions to improve the project
  • Advanced visualization
  • Add user interaction using filtering, slicers, etc., to make the project interactive
  • Create new columns and new measures, if needed
  • Use DAX functions to enhance the charts

 

Submission Details:   

Expected content: The student must show partial output using Power BI visualizations, and save and share the project.

File format: Repository link where your partial project is uploaded

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

 

Week 3: Visualization and Dashboard Preparation 

  • Power BI Analysis – Advanced Visualization 
  • Filters and Slicers
  • Adding various columns and measures to charts to achieve Project requirements 
  • Testing and Iteration 
  • Formatting 
  • Submit the Project 

 

Week 3:  

  • Prepare report(s)
  • Use advanced filtering techniques, if needed
  • Prepare the dashboard
  • Format visuals and the canvas background
  • Apply appropriate testing strategies
  • Cross-check functionality
  • Validate the project

 

Submission Details:  

Expected content: The students must have prepared the final dashboard with all visuals properly formatted and the background styled with a theme. They must share the final output, test results, and the project presentation (PPT), along with screenshots of the project as image files.

File format: .pbix, PDF, PPT

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

 

Week 4: Mock Presentation and Final Presentations

Week 4: Students should present the project PPT to the experts.

Advanced Machine Learning and Artificial Intelligence Projects

  1. Plant Disease Detection System for Sustainable Agriculture
  2. Healthcare Prediction on Diabetic Patients using Python
  3. Air Quality Index Prediction Model with Python

Machine Learning Course Project Approach

Weekly Completion Tasks

Week 1: Project Planning and Data Preparation

  • Define the business problem and set project objectives.
  • Gather relevant datasets and explore potential data sources.
  • Clean and preprocess data by handling missing values, outliers, and encoding.
  • Perform exploratory data analysis (EDA) to understand data patterns.
  • Split data into training, validation, and test sets (a minimal sketch of these steps follows this list).
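
A minimal sketch of Week 1's preparation steps, assuming a small synthetic dataset with hypothetical column names (feature_a, feature_b, target); a real project would instead load its own CSV with pandas. It fills missing values, prints quick EDA summaries, and produces a 60/20/20 train/validation/test split.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical dataset standing in for the project data; in practice you
# would load a file, e.g. df = pd.read_csv("your_dataset.csv").
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "feature_a": rng.normal(size=200),
    "feature_b": rng.normal(size=200),
    "target":    rng.integers(0, 2, size=200),
})
df.loc[::25, "feature_a"] = np.nan        # introduce some missing values

# Cleaning: fill missing values with the column median.
df["feature_a"] = df["feature_a"].fillna(df["feature_a"].median())

# Quick EDA: summary statistics and class balance.
print(df.describe())
print(df["target"].value_counts(normalize=True))

# Split into train / validation / test (60 / 20 / 20).
X, y = df[["feature_a", "feature_b"]], df["target"]
X_train, X_temp, y_train, y_temp = train_test_split(
    X, y, test_size=0.4, stratify=y, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_temp, y_temp, test_size=0.5, stratify=y_temp, random_state=42)
print(len(X_train), len(X_val), len(X_test))
```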

Week 1:   

  • Define the problem and project objectives.
  • Collect and clean the dataset.
  • Perform EDA to understand the data.
  • Split data into training, validation, and test sets.

Submission Details:  

Expected content: Students should create a GitHub repository, upload their Jupyter notebook file (.ipynb) to it, and share the link on the Week 1 submission page.

File format: GitHub Repository link where your partial project is uploaded

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

 

Week 2: Model Selection and Building

  • Research and choose appropriate models for the task.
  • Implement a baseline model and evaluate its performance.
  • Train various machine learning models (e.g., Random Forest, SVM, Deep Learning).
  • Conduct feature engineering to improve model performance.
  • Apply cross-validation for more reliable model evaluation (see the sketch after this list).
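
A minimal scikit-learn sketch of the steps above: a majority-class baseline, a Random Forest candidate model, and 5-fold cross-validation. The bundled breast-cancer dataset stands in for the project data (for example, the diabetic-patient records), and the model choice and settings are illustrative, not prescribed.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split

# Placeholder dataset; the project dataset would be loaded instead.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Baseline model: always predicts the majority class.
baseline = DummyClassifier(strategy="most_frequent").fit(X_train, y_train)
print("Baseline accuracy:", baseline.score(X_test, y_test))

# Candidate model: a Random Forest classifier.
model = RandomForestClassifier(n_estimators=200, random_state=42)

# 5-fold cross-validation on the training set for a more reliable estimate.
scores = cross_val_score(model, X_train, y_train, cv=5, scoring="accuracy")
print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```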

Week 2:

  • Research and choose appropriate models.
  • Implement a baseline model and evaluate it.
  • Train different models and tune hyperparameters.
  • Perform feature engineering for improvement.
  • Use cross-validation to check model reliability.

Submission Details:   

Expected content: The student must show the partial output in a Jupyter Notebook, and save and share the GitHub link where the project is uploaded.

File format: .ipynb file, .py file

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

 

Week 3: Model Evaluation and Optimization

  • Evaluate models using metrics like accuracy, precision, recall, or RMSE.
  • Fine-tune models through hyperparameter optimization and regularization.
  • Perform error analysis to address underfitting or overfitting issues.
  • Implement ensemble methods like bagging or boosting if needed (see the sketch after this list).
  • Use model interpretation techniques to explain predictions.
  • Testing and iteration
  • Formatting
  • Submit the project
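
The evaluation and tuning bullets above can be sketched with scikit-learn: a grid search over a couple of Random Forest hyperparameters, a classification report, a train-vs-test comparison as a quick over/underfitting check, and feature importances for interpretation. The built-in breast-cancer dataset and the parameter grid are placeholders for the project's own data and search space.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)   # placeholder dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Hyperparameter tuning with grid search and 5-fold cross-validation.
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid, cv=5, scoring="f1")
search.fit(X_train, y_train)
print("Best parameters:", search.best_params_)

# Evaluate the tuned model with several metrics, not just accuracy.
y_pred = search.best_estimator_.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print(classification_report(y_test, y_pred))

# Error analysis: comparing train vs. test scores hints at over/underfitting.
print("Train score:", search.best_estimator_.score(X_train, y_train))
print("Test score :", search.best_estimator_.score(X_test, y_test))

# Model interpretation: feature importances from the forest.
importances = search.best_estimator_.feature_importances_
print("Largest feature importance:", importances.max())
```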

Week 3:

  • Evaluate models using relevant metrics.
  • Tune hyperparameters for better performance.
  • Perform error analysis to refine the model.
  • Implement ensemble techniques for boosting performance.
  • Interpret model output.

Submission Details:

Expected content: The student must show the output in a Jupyter Notebook, and save and share the project. Also create a PPT for the project.

File format: .ipynb file, .py file, PPT

Project Submission Link: on the LMS (Skills4future.in) via GitHub link

Week 4: Mock Presentation and Final Presentations

Week 4: Students should present the project PPT to the experts.

About the Project

This project focuses on developing a functional chatbot capable of understanding user inputs by identifying intents and extracting entities. By leveraging NLP techniques and a Logistic Regression model, the chatbot interprets text inputs and provides appropriate responses. A Streamlit-based interface ensures user-friendly interaction, enabling seamless communication with the chatbot. This project serves as a foundational step toward creating advanced conversational agents, with scope for improvement through richer datasets and more sophisticated NLP techniques.

Learning Objectives

The objectives of this project are to:

  • Learn how chatbots process user input, recognize intents, and generate responses.
  • Use tokenization and TF-IDF vectorization to preprocess and analyze textual data.
  • Train and evaluate a Logistic Regression model for intent classification (a minimal sketch follows this list).
  • Create and deploy an interactive chatbot interface using the Streamlit framework.
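
The bullets above name TF-IDF vectorization and Logistic Regression for intent classification; a minimal sketch of that pipeline is shown below. The example utterances, intent labels, responses, and the chatbot_reply() helper are hypothetical placeholders, and a Streamlit front end could wrap the helper with st.text_input and st.write.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical intent dataset; a real chatbot would use a much
# larger set of labelled example utterances.
texts = [
    "hello", "hi there", "good morning",
    "bye", "see you later", "goodbye",
    "what is the weather today", "will it rain tomorrow",
]
intents = [
    "greeting", "greeting", "greeting",
    "farewell", "farewell", "farewell",
    "weather", "weather",
]
responses = {
    "greeting": "Hello! How can I help you?",
    "farewell": "Goodbye! Have a great day.",
    "weather": "I cannot check live weather yet, but I am learning!",
}

# TF-IDF vectorization + Logistic Regression intent classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, intents)

def chatbot_reply(user_input: str) -> str:
    """Predict the intent of the input and return a canned response."""
    intent = model.predict([user_input])[0]
    return responses[intent]

print(chatbot_reply("hello there"))
print(chatbot_reply("bye for now"))
```

With a richer dataset and more intents, the same pipeline structure carries over; only the training examples and the response table grow.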

About the Project

This project aims to develop a CNN-based plant disease detection model using deep learning and image processing techniques. By analyzing leaf images, the model will identify both healthy and diseased leaves while predicting specific disease types for crops like apple, cherry, grape, and corn. The project involves collecting a diverse dataset, preprocessing images, extracting meaningful features, training and evaluating the CNN model, and optimizing its performance. The ultimate goal is to create a practical tool for early plant disease detection, aiding farmers and agricultural experts in efficient crop management and reducing crop losses.

Learning Objectives

The objectives of this project are to:

  • Dataset Acquisition and Preparation:
    • Collect a diverse dataset of leaf images representing various plant species, including healthy and diseased samples.
    • Preprocess the images to ensure uniform size, resolution, and quality.
    • Label the images with corresponding plant species and disease types.
  • Exploratory Data Analysis:
    • Analyze dataset characteristics such as image size, color distribution, and class balance.
  • Feature Extraction:
    • Extract visual features such as leaf texture, color patterns, and vein structures using CNN layers.
    • Experiment with different CNN architectures and transfer learning models to improve feature extraction.
  • Model Development:
    • Train and evaluate deep learning models, focusing on CNN architectures for image classification (a minimal sketch follows this list).
    • Optimize model hyperparameters such as learning rate and number of layers to enhance performance.
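
The Model Development bullets above call for a CNN image classifier. The sketch below shows one plausible small architecture in Keras, assuming four example classes and 128x128 RGB inputs; the class count, layer sizes, and the "leaf_images/" folder are hypothetical placeholders, not the project's prescribed design.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4        # hypothetical: e.g. healthy, scab, black rot, rust
IMG_SIZE = (128, 128)  # images are resized to a uniform resolution

# A small CNN for leaf-image classification (illustrative architecture).
model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (3,)),
    layers.Rescaling(1.0 / 255),                      # normalize pixel values
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one score per class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()

# With a labelled image folder (one sub-folder per class) the dataset could
# be loaded and the model trained like this -- "leaf_images/" is a
# hypothetical path:
# train_ds = tf.keras.utils.image_dataset_from_directory(
#     "leaf_images/", image_size=IMG_SIZE, batch_size=32)
# model.fit(train_ds, epochs=10)
```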

About the Project

This project aims to develop an advanced Air Quality Index (AQI) prediction model using machine learning techniques. By accurately forecasting AQI values based on real-time data from various pollutants, the model will enable individuals and organizations to take proactive measures to mitigate the harmful effects of air pollution. The project will involve data acquisition, preprocessing, exploratory data analysis, feature engineering, model development, and evaluation. The ultimate goal is to create a reliable and accurate AQI prediction tool that can contribute to public health and environmental protection.

Learning Objectives

The objectives of this project are to:

  • Develop a comprehensive understanding by exploring and pre-processing the data:
    • Gather and clean the dataset, ensuring data quality and consistency.
    • Explore the relationships between air quality pollutants and AQI values.
  • Exploratory Data Analysis:
    • Analyze the distribution of air quality pollutants and AQI values.
    • Identify trends, patterns, and correlations within the data.
  • Feature Engineering:
    • Create new features or transform existing ones to improve model performance.
    • Consider factors such as time-series patterns, seasonal variations, and interactions between pollutants.
  • Apply Machine Learning Algorithms:
    • Experiment with various machine learning algorithms (e.g., linear regression, time series models, random forests, neural networks) to find the most suitable approach for AQI prediction (a minimal sketch follows this list).
    • Tune model parameters to optimize performance.
  • Develop Insights and Recommendations:
    • Interpret the model's predictions and identify key factors influencing AQI levels.
    • Provide recommendations for individuals and organizations to reduce their exposure to air pollution and improve air quality.
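
The modelling bullets above suggest regression approaches such as random forests. Below is a minimal sketch that trains a RandomForestRegressor on synthetic pollutant readings; the pollutant columns, the toy AQI formula, and all numbers are invented purely so the example runs, and a real project would use actual monitoring data with proper feature engineering.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic pollutant readings standing in for a real AQI dataset.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "pm25":  rng.uniform(10, 250, n),
    "pm10":  rng.uniform(20, 400, n),
    "no2":   rng.uniform(5, 120, n),
    "month": rng.integers(1, 13, n),     # simple seasonal feature
})
# Toy target: AQI driven mostly by particulate matter, plus noise.
df["aqi"] = (0.6 * df["pm25"] + 0.25 * df["pm10"] + 0.15 * df["no2"]
             + rng.normal(0, 10, n))

X = df[["pm25", "pm10", "no2", "month"]]
y = df["aqi"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, pred))

# Interpretation: which pollutants drive the predictions?
for name, imp in zip(X.columns, model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```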

Python

  • Introduction to Python: Python, created by Guido van Rossum, is a versatile programming language widely used for web development, data analysis, artificial intelligence, and more.
  • Setting up your Python environment: Choose an Integrated Development Environment (IDE) like Jupyter or VS Code and install libraries using package managers like pip to set up your Python environment efficiently.
  • Data types and variables: Python supports various data types such as numbers, strings, lists, and dictionaries, providing flexibility for diverse programming needs.
  • Operators and expressions: Python offers a range of operators, including arithmetic, comparison, and logical operators, allowing concise expression of complex operations.
  • Conditional statements: Employ conditional statements like if, elif, and else to execute specific code blocks based on different conditions in your Python programs.
  • Looping constructs: Utilize looping constructs, such as for and while loops, to iterate through data structures or execute a set of instructions repeatedly.
  • Functions: Define functions to encapsulate reusable code, pass arguments, and return values, promoting code modularity and readability in Python.
  • Basic data structures: Python's fundamental data structures, including lists, tuples, and dictionaries, empower efficient storage and manipulation of data in various formats.
  • Data manipulation: Master data manipulation techniques like indexing, slicing, and iterating to extract and transform data effectively in Python.
  • Working with files: Learn file handling in Python for tasks like reading, writing, and processing data from external files.
  • Introduction to modules and libraries: Leverage powerful Python libraries like NumPy for numerical computing and Pandas for data manipulation and analysis to enhance your coding capabilities (a short example combining several of these basics follows this list).
  • Resources:
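
A short, self-contained script tying several of the topics above together: variables, a dictionary, a loop with a conditional, a function, file writing, and a NumPy call. The crop names and yield figures are made up for illustration.

```python
import numpy as np

# Data types and variables
crops = ["wheat", "rice", "maize"]          # list
yields = {"wheat": 3.5, "rice": 4.1}        # dictionary (tonnes/hectare)

# Conditional statements and looping constructs
for crop in crops:
    if crop in yields:
        print(f"{crop}: {yields[crop]} t/ha")
    else:
        print(f"{crop}: yield unknown")

# Functions: reusable, parameterised code
def average_yield(data):
    """Return the mean of the values in a dict of crop yields."""
    return sum(data.values()) / len(data)

print("Average yield:", average_yield(yields))

# Working with files: write the yields to a simple text file
with open("yields.txt", "w") as f:
    for crop, value in yields.items():
        f.write(f"{crop},{value}\n")

# Modules and libraries: NumPy for numerical work
print("Std deviation:", np.std(list(yields.values())))
```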

Power BI

What is Power BI (Business Intelligence)?

Imagine a toolbox that helps you turn a jumble of raw data, from spreadsheets to cloud databases, into clear, visually stunning insights. That's Microsoft Power BI in a nutshell! It's a suite of software and services that lets you connect to various data sources, clean and organize the information, and then bring it to life with interactive charts, graphs, and maps. Think of it as a powerful storyteller for your data, helping you uncover hidden trends, track progress toward goals, and make informed decisions.

Useful Links for Self-Study:

Exploratory Data Analysis (EDA)

  • Introduction to EDA: Exploratory Data Analysis (EDA) involves systematically analyzing and visualizing data to discover patterns, anomalies, and insights, playing a crucial role in understanding the underlying structure of the data.
  • Importing and loading data: Data can be imported into Python from various formats such as CSV, Excel, or SQL, providing a foundation for EDA and subsequent analysis.
  • Data cleaning and preprocessing: Cleaning and preprocessing steps, including handling missing values, outliers, and inconsistencies, are essential for ensuring the accuracy and reliability of the data.
  • Descriptive statistics: Descriptive statistics, encompassing measures of central tendency and dispersion, offer a summary of the main characteristics of the dataset (a short example follows this list).
  • Data visualization: Visualizations like histograms, boxplots, and scatter plots provide a powerful means to explore data distributions, relationships, and outliers, enhancing the interpretability of the dataset.
  • Identifying patterns and relationships: EDA enables the identification of patterns and relationships within the data, helping to uncover hidden insights and guide subsequent analysis.
  • Univariate and bivariate analysis: Univariate analysis focuses on individual variables, while bivariate analysis explores relationships between pairs of variables, offering a comprehensive understanding of the dataset's structure.
  • Feature engineering: Feature engineering involves creating new features from existing data, enriching the dataset with additional information to improve the performance of machine learning models.
  • Hypothesis generation: EDA findings often lead to hypothesis generation, fostering a deeper understanding of the data and guiding further research questions or analytical approaches.
  • Resources:
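
A brief EDA sketch covering descriptive statistics, missing-value checks, univariate and bivariate plots, and correlations; the classic iris dataset bundled with scikit-learn is used only as a convenient stand-in for project data.

```python
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.datasets import load_iris

# Placeholder dataset; your project data would be loaded with pandas instead.
df = load_iris(as_frame=True).frame

# Descriptive statistics and data-quality checks
print(df.describe())
print(df.isnull().sum())             # missing values per column
print(df["target"].value_counts())   # class balance

# Univariate analysis: distribution of one variable
sns.histplot(df["sepal length (cm)"], kde=True)
plt.show()

# Bivariate analysis: relationship between two variables
sns.scatterplot(data=df, x="sepal length (cm)", y="petal length (cm)",
                hue="target")
plt.show()

# Correlations between numeric columns
print(df.corr(numeric_only=True))
```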

Data Visualization

  • Principles of data visualization: Effective data visualizations prioritize clarity, ensuring that the intended message is easily understandable, and accuracy, representing data truthfully and without distortion.
  • Choosing the right chart: Select appropriate chart types, such as bar charts, pie charts, line charts, or maps, based on the nature of your data and the insights you aim to convey.
  • Matplotlib and Seaborn libraries: Matplotlib and Seaborn are powerful Python libraries for creating both simple and advanced visualizations, providing flexibility and customization options (a short example follows this list).
  • Customizing visuals: Customize visual elements, including colors, labels, axes, and titles, to enhance the overall aesthetics and effectiveness of your data visualizations.
  • Interactive visualizations: Utilize libraries like Plotly and Bokeh to create interactive visualizations, allowing users to engage with and explore data dynamically.
  • Data storytelling: Data storytelling involves using visuals as a narrative tool to communicate insights effectively, making data more accessible and compelling for a broader audience.
  • Best practices for presenting visualizations: When presenting data visualizations, adhere to best practices such as providing context, focusing on key insights, and ensuring clarity to effectively convey the intended message.
  • Resources:
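
A compact example illustrating chart choice and customization with Matplotlib and Seaborn: a line chart for a trend over time and a bar chart for a categorical comparison, each with titles, labels, and a legend. The monthly solar/wind generation figures are invented for illustration.

```python
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

# Small hypothetical dataset: monthly generation by energy source (GWh).
df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr", "May", "Jun"],
    "solar": [12, 15, 21, 28, 35, 40],
    "wind":  [30, 28, 26, 22, 20, 18],
})

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Line chart: a good choice for trends over time.
axes[0].plot(df["month"], df["solar"], marker="o", label="Solar")
axes[0].plot(df["month"], df["wind"], marker="o", label="Wind")
axes[0].set_title("Generation trend (GWh)")
axes[0].set_xlabel("Month")
axes[0].legend()

# Bar chart: a good choice for comparing categories.
sns.barplot(x="month", y="solar", data=df, ax=axes[1], color="#2a9d8f")
axes[1].set_title("Solar generation by month (GWh)")

plt.tight_layout()
plt.show()
```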