How I built a REST API for CRUD for TECHPLEMENT | Anu Kumar posted on the topic | LinkedIn (2025)

Anu Kumar

Aspiring Python Developer | Enthusiastic Learner in Web Development | Proficient in Django Framework


TECHPLEMENT gave me a task: build a REST API that performs CRUD operations on users. The brief was to develop an API that allows users to sign up, log in, and manage their profile, with user-input validation and storage of user data in a database.
➡️ I used Python for the backend.
➡️ I used the Flask framework to build the REST API.
➡️ I used MongoDB as the database.
➡️ I used Postman to run and test the endpoints.
➡️ It supports user signup, login, profile view, and profile update.
➡️ The task became more interesting as I wrote the code.
➡️ This was my first task for a company, so I was very excited and had some doubts.
➡️ It was a very good experience with TECHPLEMENT.
➡️ If you sign up without an email or password, it shows an error message. ❌
➡️ The password must be at least 6 characters. ✅
➡️ If the email and password meet the specification, your account is stored in the database.
➡️ To log in, you must enter the email and password correctly. ✅
➡️ If they are wrong, you get an error message. ❌
➡️ You can view your profile over the web, but you must type your email correctly.
➡️ If the email is wrong, you get an error message. ❌
➡️ You can edit your email and password whenever you want. ✅ For that you must supply both the email and the password. ✅
➡️ If the email is not in the database, the update fails and you get an error message. ❌
#techplement #techplementteam #techplementinternship #api #python #flask
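The signup and login rules described above can be sketched in plain Python. This is a minimal, hypothetical illustration: an in-memory dict stands in for the MongoDB collection, and all function and variable names are assumptions, not the post's actual code.

```python
import re

# In-memory stand-in for the MongoDB users collection (illustration only).
users = {}

def signup(email, password):
    """Validate and store a new user, mirroring the rules in the post."""
    if not email or not password:
        return "error: email and password are required"         # missing fields ❌
    if len(password) < 6:
        return "error: password must be at least 6 characters"  # too short ❌
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        return "error: invalid email format"                    # bad email ❌
    users[email] = password  # a real app would hash the password
    return "signup successful"                                  # stored ✅

def login(email, password):
    """Check the supplied credentials against the stored users."""
    if users.get(email) != password:
        return "error: wrong email or password"                 # ❌
    return "login successful"                                   # ✅
```

In a real Flask app each function would back a POST route (for example a hypothetical `@app.route("/signup", methods=["POST"])`), and `users` would be a MongoDB collection accessed via pymongo.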


TECHPLEMENT

6mo


Congratulations🎉 👏, keep up the good work. We truly appreciate your effort and hard work.


More Relevant Posts

  • Ankur Dasgupta

    Coding Enthusiast, Full Stack Developer


    I have developed a simple REST API system (deployed on my local machine) that greets users with a random quote of the day. It also lets a user:
    - add a quote of their choice,
    - edit an existing quote,
    - delete an existing quote.

    I begin the video by demonstrating how the Add feature works. You type a quote into the input box and click the "Add" button; the string is then sent to the back end for processing. Assuming the quote you entered is not a duplicate and is not an empty string (that is, you didn't submit nothing), it is added to the database and saved. The "Display list of quotes" button shows all existing quotes in the database, including the one you just added.

    After this I show how the Edit feature works. After clicking "Display list of quotes", you will notice an "Edit" hyperlink beside each quote. Clicking this hyperlink takes you to the editing area, where you can change the quote that was next to the hyperlink. After making your changes, simply click the "Submit" button to finalize and push the changes to the database.

    Lastly, the Delete feature. This one is the simplest: you only need to enter the first few letters or words of the quote you want to delete and press the "Delete quote" button. Since many quotes can start with the same letters or words, it is advisable to enter a decent portion of the quote so you don't accidentally delete an additional quote on top of the one you wanted (the Delete feature removes all quotes that start with the entered string, so be careful and as specific as possible!).

    I used HTML, Python, Flask and MySQL to achieve this. I already had some familiarity with Flask, but this small yet useful project helped me understand it even better. TECHPLEMENT
    #techplement #techplementteam #techplementinternship
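The add/edit/delete behaviour described above, including the prefix-based delete the author warns about, can be sketched with an in-memory list. The real project uses Flask and MySQL; every name below is a hypothetical stand-in for illustration.

```python
# In-memory stand-in for the quotes table (illustration only).
quotes = []

def add_quote(text):
    """Add a quote unless it is empty or a duplicate, as the post describes."""
    text = text.strip()
    if not text or text in quotes:
        return False
    quotes.append(text)
    return True

def edit_quote(index, new_text):
    """Replace the quote at a given position (the 'Edit' hyperlink's job)."""
    quotes[index] = new_text

def delete_by_prefix(prefix):
    """Delete every quote starting with the prefix; returns what was removed.
    Note this removes ALL matches, which is why the post says to be specific."""
    removed = [q for q in quotes if q.startswith(prefix)]
    quotes[:] = [q for q in quotes if not q.startswith(prefix)]
    return removed
```

The prefix delete is the interesting design choice: matching on `startswith` means a short prefix can wipe several quotes at once, exactly the pitfall the post flags.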


  • Vskills - India's Largest Certification Body

    13,754 followers


    Exploring the Power of FastAPI 🚀
    FastAPI is a cutting-edge Python web framework designed for efficient API development. Here are some key highlights:
    📚 Automatic Documentation
    ✅ Data Validation Made Easy
    ⚙️ Asynchronous Support
    🔄 Dependency Injection System
    🔐 Security Features
    ⚡ Performance Optimization
    Learn the art of FastAPI with Vskills certification; use coupon code FastAPI20 to get a 20% discount.
    Learn more: https://lnkd.in/gNgX2eTV
    #vskills #certification #fastapi #api #testing #apitesting #jobs #career #skills #careerdevelopment #skillsdevelopment #skillsgap #skillsassessment #certificate #onlinecourse #softwaretesting #automationtesting

    Certificate in FastAPI vskills.in


  • Chaturdhan Chaubey

    Data Engineering intern at @IITB || Data Science || Front-End Development || DSA || Python.


    Hello Friends 👋. I am delighted to share that I completed Project 1 of the machine learning track at VerveBridge. It is my first step into advanced ML concepts, and into Python as well. I built a Book Recommendation System, which is especially helpful for people who read a lot of books. Here is the complete workflow of the project's system architecture:

    Step 1: Basic workflow of the system design
    - Data collection
    - Data preprocessing
    - Exploratory data analysis
    - Model building
    - Saving the model

    Step 2: Creating the user interface
    - Create a Flask app (app.py): a straightforward and adaptable way to develop Python-based web applications and APIs.
    - Load the saved model in app.py; it was created in the notebook above and saved in .pkl form.
    - In the same Flask app, add HTML and CSS files for the user-facing UI.

    Step 3: Running the system
    A) Open the source code in your preferred IDE.
    B) In the terminal, create a virtual environment for the Flask app.
    C) Install all necessary modules listed in requirements.txt.
    D) Type this command in the terminal to run the Flask app: python -m flask --app .\app.py run
    E) Click the URL that appears in the output.

    Technologies used: Flask (Python framework), HTML5, CSS, machine learning, Python.
    GitHub: https://lnkd.in/eqwR_erS
    #internship #machinelearning #vervebridge
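The "save the model as .pkl, then load it in app.py" step above can be sketched with the stdlib pickle module. This is a toy stand-in: a real recommender would pickle a similarity matrix, while here a hand-made dict of invented book titles plays the model's role.

```python
import pickle

# Toy stand-in for the recommendation model the post saves as a .pkl file.
# All titles and neighbours below are invented for illustration.
model = {
    "The Hobbit": ["The Fellowship of the Ring", "Eragon"],
    "Dune": ["Foundation", "Hyperion"],
}

# Step "Save the Model": serialise the model to disk from the notebook.
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# In app.py: load the pickled model back at startup.
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)

def recommend(book):
    """Logic a Flask route could call: look up neighbours for a book."""
    return loaded.get(book, [])
```

In the real app, `recommend` would sit behind a Flask route and render its result into the HTML template from Step 2.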


  • Karan Babani

    Co-Founder and CEO at Foxit | Computer science student with a focus on artificial intelligence & machine learning


    Today was an exhilarating and productive day as we dived into MongoDB and its integration with Python. 🌐🖥️
    🔹 MongoDB Integration: We learned how to set up and connect to a MongoDB database, an essential skill for any developer working with NoSQL databases.
    🔹 Python Integration: We wrote Python scripts to interact with MongoDB, allowing us to perform CRUD (Create, Read, Update, Delete) operations seamlessly.
    🔹 Full-Stack Development: We took it a step further by developing a full-stack application. We created a user-friendly form using HTML and CSS, then integrated it with a Flask backend. This form collects user data and stores it directly in our MongoDB database, showcasing a practical application of our learning.
    🔹 Jazbaa Project: I also made significant progress on my Jazbaa project, creating the login page under the guidance of Vimal Daga. His insights were invaluable in shaping a functional and user-friendly interface.
    Looking forward to more learning and growth in the coming days! 🚀
    #day09 LinuxWorld Informatics Pvt Ltd
    #LinuxWorldTraining #MongoDB #Python #FullStackDevelopment #Flask #HTML #CSS #CodingJourney #TechTraining #DeveloperSkills #LearningAndGrowing #linuxworld #vimaldaga #bethecreator #makingindiafutureready #summerinternship
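The four CRUD operations mentioned above have a characteristic call shape in pymongo (`insert_one`, `find_one`, `update_one` with `{"$set": ...}`, `delete_one`). Since running those calls needs a live MongoDB server, the sketch below uses a tiny in-memory class that only mirrors those call shapes so the flow is visible; it is an illustration, not pymongo itself.

```python
# In-memory stand-in mirroring pymongo's collection API (illustration only).
# With a real server you would instead do:
#   from pymongo import MongoClient
#   users = MongoClient("mongodb://localhost:27017")["mydb"]["users"]

class Collection:
    def __init__(self):
        self.docs = []

    def insert_one(self, doc):              # Create
        self.docs.append(dict(doc))

    def find_one(self, query):              # Read
        for d in self.docs:
            if all(d.get(k) == v for k, v in query.items()):
                return d
        return None

    def update_one(self, query, update):    # Update, pymongo-style {"$set": ...}
        d = self.find_one(query)
        if d:
            d.update(update["$set"])

    def delete_one(self, query):            # Delete
        d = self.find_one(query)
        if d:
            self.docs.remove(d)

users = Collection()
users.insert_one({"name": "Asha", "city": "Jaipur"})
users.update_one({"name": "Asha"}, {"$set": {"city": "Mumbai"}})
```

A Flask form handler would call `insert_one` with the submitted form fields, which is exactly the full-stack flow the post describes.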


  • Durga Vathi Devisetty

    Aspiring Data Analyst| Expertise Python and MYSQL


    🚀 Exciting News! Just completed my latest project: replicating the power of regex matching and email verification in a web application using Flask! 🌐💻
    Here's a quick dive into the journey:
    1️⃣ Created a new project directory and set up a virtual environment with Flask.
    2️⃣ Launched a Flask application, defining routes for home and form submission.
    3️⃣ Crafted a sleek HTML template for user-friendly input of test strings and regexes.
    4️⃣ Implemented a submission route, leveraging Python's re module for robust regex matching.
    5️⃣ Results are displayed seamlessly below the input form on the HTML template.
    6️⃣ Rigorous testing ensures accuracy: users can experiment with various test strings and regex patterns!
    As a bonus, I introduced a feature for email ID validation. 📧✨
    Super thrilled to share this project! 🎉 Your feedback is invaluable; let's connect and discuss! 🤝
    #flask #regex #webdevelopment #python #emailverification #aws #internship #innomaticsresearchlabs #techinnovation
    GitHub link: https://lnkd.in/gGANGQA7
    AWS link: http://3.26.129.9:5000/
    Innomatics Research Labs Kanav Bansal SAXON K SHA Ramya Bhargavi Anumula Raghu Ram Aduri Atluri Naga Baswanth
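The core of step 4️⃣ (running a user-supplied regex against a test string with the `re` module) and the bonus email check can be sketched as plain functions; the Flask route would simply call them with the form fields. Function names and the email pattern are assumptions for illustration, not the project's actual code.

```python
import re

def match_pattern(test_string, pattern):
    """Core of the form-submission route: run the user's regex against
    the test string and return all matches (None if the regex is invalid)."""
    try:
        return re.findall(pattern, test_string)
    except re.error:
        return None  # user typed a malformed regex into the form

# Bonus feature from the post: simple email ID validation.
# This pattern is a deliberately simple illustration, not a full RFC check.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")

def is_valid_email(email):
    return bool(EMAIL_RE.match(email))
```

Catching `re.error` matters in a web form: users will enter broken patterns, and the route should report that rather than crash.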


  • Neha Poddar

    MCA student @CUH || BSc Mathematics


    Hello connections!!
    I am happy to share the completion of Task 4 at SkillCraft Technology as a Software Development Intern: create a program that extracts product information, such as names, prices, and ratings, from an online e-commerce website and stores the data in a structured format like a CSV file using Python.
    SkillCraft Technology
    #SoftwareDevelopmentIntern #PythonProgramming #SkillCraftTechnology
    ↗ Web scraping is a data-extraction method used to gather data from websites. It is widely used for data mining and collecting valuable insights from large websites, and it comes in handy for personal use as well. Python has an amazing library called BeautifulSoup for web scraping; we will be using it to scrape product information and save the details in a CSV file.
    ➡ Steps involved:
    Step 1: Initialize the program.
    Step 2: Retrieve element IDs.
    Step 3: Save the current information to a text file, repeating steps 2 and 3 for all of the attributes we wish to capture from the page, like item price and availability.
    Step 4: Close the file.
    Step 5: Call the function we just created.
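The steps above can be sketched end to end. The post uses BeautifulSoup on a live site; to keep this self-contained, the sketch below uses the stdlib `html.parser` on a made-up HTML snippet (the class names and products are invented), but the shape is the same: find each product element, pull out its attributes, and write them to a CSV file.

```python
import csv
from html.parser import HTMLParser

# Invented e-commerce markup standing in for a real page (illustration only).
SAMPLE_HTML = """
<div class="product"><span class="name">Blue Mug</span>
<span class="price">199</span><span class="rating">4.5</span></div>
<div class="product"><span class="name">Desk Lamp</span>
<span class="price">899</span><span class="rating">4.1</span></div>
"""

class ProductParser(HTMLParser):
    """Step 2: walk the markup, collecting name/price/rating per product."""
    def __init__(self):
        super().__init__()
        self.products, self.field = [], None

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "div" and cls == "product":
            self.products.append({})        # new product record
        elif tag == "span" and cls in ("name", "price", "rating"):
            self.field = cls                # remember which attribute follows

    def handle_data(self, data):
        if self.field and self.products:
            self.products[-1][self.field] = data.strip()
            self.field = None

parser = ProductParser()
parser.feed(SAMPLE_HTML)

# Steps 3-4: write every captured attribute to a CSV file, then close it.
with open("products.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "price", "rating"])
    writer.writeheader()
    writer.writerows(parser.products)
```

With BeautifulSoup the parser class collapses to a few `find_all` calls, but the CSV-writing half is identical.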


  • Muhammad Abdullah Saqib

    Junior Python Developer | AI | ML | Meta (Coursera) Certified Social Media Marketer🏅


    🎉 Excited to share my latest project: 𝐏𝐞𝐫𝐬𝐨𝐧𝐚𝐥 𝐅𝐢𝐧𝐚𝐧𝐜𝐞 𝐓𝐫𝐚𝐜𝐤𝐞𝐫 💲
    I've just completed building a 𝐏𝐞𝐫𝐬𝐨𝐧𝐚𝐥 𝐅𝐢𝐧𝐚𝐧𝐜𝐞 𝐓𝐫𝐚𝐜𝐤𝐞𝐫 in Python, designed to help users efficiently track their expenses, categorize them, and calculate totals. This project gave me hands-on experience with Python data structures like dictionaries, and with CSV files for storage.
    📂 I've included a sample '𝘦𝘹𝘱𝘦𝘯𝘴𝘦𝘴.𝘤𝘴𝘷' file to demonstrate how the app works; feel free to check out the code and functionality!
    🔗 𝗖𝗵𝗲𝗰𝗸 𝗼𝘂𝘁 𝘁𝗵𝗲 𝗿𝗲𝗽𝗼𝘀𝗶𝘁𝗼𝗿𝘆 𝗵𝗲𝗿𝗲: https://lnkd.in/d7HXc83k
    Working through this project helped me hone my skills in file handling, user input processing, and data management. Every step of the way was a learning experience, and I'm always excited to grow more in this field.
    📢 𝐼’𝑚 𝑐𝑢𝑟𝑟𝑒𝑛𝑡𝑙𝑦 𝒐𝒑𝒆𝒏 𝒕𝒐 𝒘𝒐𝒓𝒌 𝒂𝒏𝒅 𝒂𝒄𝒕𝒊𝒗𝒆𝒍𝒚 𝒍𝒐𝒐𝒌𝒊𝒏𝒈 𝒇𝒐𝒓 𝒂 𝑷𝒚𝒕𝒉𝒐𝒏 𝒊𝒏𝒕𝒆𝒓𝒏𝒔𝒉𝒊𝒑 𝒕𝒐 𝒇𝒖𝒓𝒕𝒉𝒆𝒓 𝒔𝒉𝒂𝒓𝒑𝒆𝒏 𝒎𝒚 𝒄𝒐𝒅𝒊𝒏𝒈 𝒂𝒃𝒊𝒍𝒊𝒕𝒊𝒆𝒔 𝒂𝒏𝒅 𝒄𝒐𝒏𝒕𝒓𝒊𝒃𝒖𝒕𝒆 𝒕𝒐 𝒎𝒆𝒂𝒏𝒊𝒏𝒈𝒇𝒖𝒍 𝒑𝒓𝒐𝒋𝒆𝒄𝒕𝒔. 𝑳𝒆𝒕’𝒔 𝒄𝒐𝒏𝒏𝒆𝒄𝒕 𝒊𝒇 𝒚𝒐𝒖 𝒉𝒂𝒗𝒆 𝒐𝒑𝒑𝒐𝒓𝒕𝒖𝒏𝒊𝒕𝒊𝒆𝒔 𝒊𝒏 𝑷𝒚𝒕𝒉𝒐𝒏 𝒅𝒆𝒗𝒆𝒍𝒐𝒑𝒎𝒆𝒏𝒕 𝒐𝒓 𝒔𝒊𝒎𝒊𝒍𝒂𝒓 𝒇𝒊𝒆𝒍𝒅𝒔! 💼
    #Python #OpenToWork #Internship #FinanceTracker #PythonDevelopment #GitHub #LearningByDoing #DataHandling #CSV #Programming #CodingLife
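The tracker's core idea (expenses as dictionaries, stored in a CSV file, with per-category totals) can be sketched as follows. The file name mirrors the sample `expenses.csv`, but the column names and amounts here are assumptions made up for illustration.

```python
import csv
from collections import defaultdict

# Expenses as a list of dictionaries, the data structure the post mentions.
# All values below are invented sample data.
expenses = [
    {"category": "food", "amount": "250"},
    {"category": "travel", "amount": "120"},
    {"category": "food", "amount": "80"},
]

# Persist them to CSV, as the tracker does with its expenses.csv file.
with open("expenses.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["category", "amount"])
    writer.writeheader()
    writer.writerows(expenses)

def totals_by_category(path):
    """Re-read the CSV and sum the amounts per category."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["category"]] += float(row["amount"])
    return dict(totals)
```

Reading the totals back from the file, rather than from the in-memory list, is what makes the data survive between runs of the app.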

    GitHub - abdullahsaqib100/personal-finance-tracker github.com

  • Yashasvi Mahendraker

    Automation Architect | AWS Cloud Engineer


    🚀 Today, I've had the privilege of diving into Python and Lambda functions, and the experience has been nothing short of transformative! 💻
    The ability to leverage Python in Lambda functions opens up a world of possibilities for automation, efficiency, and innovation, such as:
    1. Designing Complex Logic: Python's flexibility allows you to design complex logic within Lambda functions. This can include conditional statements, loops, error handling, and integration with external services or APIs.
    2. Handling Event Data: Lambda functions often process event data from various sources like S3, DynamoDB, API Gateway, etc. Python's rich ecosystem of libraries makes it straightforward to parse, manipulate, and respond to this data effectively.
    3. Optimizing Performance: While Python is known for its simplicity and readability, it's important to optimize your code for performance, especially in resource-constrained environments like Lambda. Techniques like asynchronous programming, caching, and efficient data processing can help improve execution speed and reduce costs.
    4. Security Considerations: Always adhere to security best practices when writing Lambda functions. This includes validating input data, encrypting sensitive information, and implementing appropriate IAM policies to restrict access.
    Thank you, Tejas Mahendrakar, for being an exceptional mentor and for instilling in me the confidence to embrace new challenges and technologies. I'm incredibly grateful for the opportunity to learn from you and excited to continue this journey of growth together! 🌟💼
    #LearningFromTheBest #Python #Lambda #Mentorship
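Points 2 and 4 above come together in a Lambda handler's standard shape: a `lambda_handler(event, context)` function that validates the incoming event and pulls out the fields it needs. The sketch below uses an S3-style event; the event structure shown is the documented S3 notification shape, but the handler logic and bucket key are hypothetical examples, and locally we call the handler directly instead of deploying it.

```python
import json

def lambda_handler(event, context):
    """Hypothetical AWS Lambda handler: validate the input event (point 4)
    and extract object keys from S3-style records (point 2)."""
    records = event.get("Records", [])
    if not records:
        # Input validation: reject events with nothing to process.
        return {"statusCode": 400, "body": json.dumps("no records")}
    keys = [r["s3"]["object"]["key"] for r in records]
    return {"statusCode": 200, "body": json.dumps({"processed": keys})}

# Local invocation with a sample event; in AWS, Lambda supplies a real
# context object and the event comes from the configured trigger.
sample_event = {"Records": [{"s3": {"object": {"key": "uploads/a.csv"}}}]}
```

Returning an API-Gateway-style `{"statusCode": ..., "body": ...}` dict keeps the same handler usable behind an HTTP trigger as well.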


  • Debarghya Das

    BCA'25 @TIHC || Ex-Intern @Celebal Technologies || Ex-SDE Intern @Prodigy InfoTech || Ex-Intern @Oasis Infobyte


    I'm excited to share with all my fellow connections that I have completed #task5 of the Software Development Internship at Prodigy InfoTech.
    The fifth task is a simple web-scraping program implemented in Python. Here's a breakdown of what the code does:
    1. Importing libraries: the code begins by importing three Python libraries:
       - requests: used for making HTTP requests to a specified URL.
       - BeautifulSoup: a library for parsing HTML and XML documents.
       - csv: used for reading and writing CSV files.
    2. Function scrape_web(url):
       - Takes a single argument, url, which represents the website to scrape.
       - Sets custom headers for the HTTP request to mimic a web browser.
       - The requests.get(url, headers=headers) line fetches the webpage content.
       - The content is then parsed using BeautifulSoup to create a structured representation of the HTML.
       - The function iterates through each product item on the webpage (identified by the class "s-result-item").
       - For each product, it extracts the product name, price, and rating (if available).
       - The extracted information is stored in a list of dictionaries called products.
    3. Function save_to_csv(products, filename):
       - Takes two arguments: products, the list of dictionaries containing product information, and filename, the name of the CSV file to save the data to.
       - Opens the specified file in write mode and creates a CSV writer.
       - Writes the header row (with field names "Name", "Price", and "Rating") to the file.
       - For each product, writes the relevant data to subsequent rows of the CSV file.
    4. Main execution:
       - The code checks whether it's being run as the main program (i.e., not imported as a module).
       - If so, it prompts the user to input a website URL.
       - It then calls the scrape_web function with the provided URL.
       - The scraped product data is saved to a CSV file named "web_scraping.csv".
       - Finally, a success message is printed.
    In summary, this Python script scrapes product information from a specified webpage and saves it to a CSV file. The user provides the URL, and the script extracts product names, prices, and ratings (if available). The resulting data is organized and stored in a CSV file for further analysis or use.
    GitHub link: https://lnkd.in/gBVkCmFb
    #ProdigyInfoTech #internship #SoftwareDevelopment
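The `save_to_csv(products, filename)` half of the breakdown above can be sketched directly, since it needs only the stdlib csv module (the `scrape_web` half is omitted here because it needs requests, BeautifulSoup, and a live site). The sample products are invented values shaped like `scrape_web`'s described output.

```python
import csv

def save_to_csv(products, filename):
    """Write the scraped product dictionaries to a CSV file with the
    header row "Name", "Price", "Rating", matching point 3 above."""
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["Name", "Price", "Rating"])
        writer.writeheader()        # header row
        writer.writerows(products)  # one row per product

# Example input shaped like scrape_web's output (values invented here).
products = [
    {"Name": "Widget", "Price": "9.99", "Rating": "4.2"},
    {"Name": "Gadget", "Price": "19.99", "Rating": ""},  # rating unavailable
]
save_to_csv(products, "web_scraping.csv")
```

Using `csv.DictWriter` keyed on the same dictionary keys `scrape_web` produces means the two functions stay in sync through the field names alone.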


  • Yashvi S.

    Python Developer @ Google Cloud Arcade Facilitator Program | B.tech in Computer Engineering


    🌟 Exciting Achievement! Completed Task 1: Python Script for Web Scraping and Automation 🌟
    I'm thrilled to share that I've successfully completed Task 1 with InternCareer, which involved creating a Python script for web scraping and automation. Here's a brief overview of the steps I followed:
    - Choosing the right website: selected a website with publicly accessible data that piqued my interest. Options considered included news websites, e-commerce platforms, and more.
    - Using web-scraping libraries: leveraged powerful libraries like Beautiful Soup and Requests to effectively scrape data from the chosen website.
    - Data processing: cleaned and organized the scraped data efficiently, using Pandas to handle various data-processing tasks and ensure the data was in a useful format.
    - Automation: automated the script to run at regular intervals using cron (for Unix systems) or Task Scheduler (for Windows). This keeps the data up to date without manual intervention.
    This task was a fantastic learning experience and a great way to enhance my skills in Python, web scraping, and automation. Looking forward to tackling more challenges and applying these skills to real-world projects!
    #Python #InterCareer #WebScraping #Automation #DataProcessing #BeautifulSoup #Requests #Pandas #Cron #TaskScheduler #Learning #Coding #Tech
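The automation step above, scheduling the scraper with cron on a Unix system, comes down to a single crontab entry. This is a config fragment with hypothetical paths; the interpreter path, script path, and log path must be adjusted to the actual machine.

```
# Run the scraper every day at 06:00 (minute hour day month weekday command).
# All paths below are examples, not the project's real layout.
0 6 * * * /usr/bin/python3 /home/user/scraper/scrape.py >> /home/user/scraper/scrape.log 2>&1
```

The entry is added with `crontab -e`; redirecting stdout and stderr to a log file is what makes an unattended run debuggable.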



