Projects:

A.Y. 2020-2021


Title: Trash classification using OpenCV


Description:

This project detects and classifies trash using a neural network. An earlier approach based on the K-Means algorithm was used to classify the trash, but it achieved only limited accuracy, so we moved to a neural network, which gives better accuracy. For identification and classification we use feature extraction followed by a neural network classifier. The same techniques are used in commercial applications such as face identification, object tracking, and image retrieval.
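
Below is a minimal sketch of the feature-extraction plus neural-network idea described above, assuming images stored in one folder per class; the HSV histogram features, class names, folder layout, and MLP settings are illustrative assumptions, not necessarily the team's exact pipeline:

    # Assumed layout: dataset/<class name>/<image>.jpg; feature = HSV colour histogram.
    import cv2
    import numpy as np
    from pathlib import Path
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    CLASSES = ["plastic", "paper", "metal", "glass"]   # placeholder class names

    def extract_features(image_path):
        """Read an image and return a normalised, flattened HSV colour histogram."""
        img = cv2.resize(cv2.imread(str(image_path)), (128, 128))
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1, 2], None, [8, 8, 8],
                            [0, 180, 0, 256, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    X, y = [], []
    for label, cls in enumerate(CLASSES):
        for path in (Path("dataset") / cls).glob("*.jpg"):
            X.append(extract_features(path))
            y.append(label)

    X_train, X_test, y_train, y_test = train_test_split(
        np.array(X), np.array(y), test_size=0.2, random_state=42)
    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)  # small neural network
    clf.fit(X_train, y_train)
    print("Test accuracy:", clf.score(X_test, y_test))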

Name(s): RAMIDI MANIKANTA REDDY (17H61A05A3), DAGGULA UJWALA (17H61A0577), E AKHILA (17H61A0578)

Title: Spam Detection for YouTube comments


Description:

Spamming is the use of a messaging system to send unsolicited messages. YouTube is one of the biggest sites for users to get information: a user can subscribe to a channel, like or dislike a video, and give opinions in the comment section of that video, and this interactivity has attracted an ever-growing number of users. It also attracts spammers, who flood the comment sections. Because YouTube offers only limited tools for comment moderation, the spam volume is increasing sharply, which leads many owners to disable the comment section on their videos. YouTube spam comments also have the potential to spread malware through comment fields, exploiting vulnerabilities in users' machines. This project detects such spam comments automatically.
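
Below is a minimal sketch of comment spam detection treated as text classification; the CSV layout (CONTENT and CLASS columns) follows the public YouTube Spam Collection datasets and is an assumption, as is the choice of a TF-IDF plus Naive Bayes model:

    # Assumed data: a CSV with CONTENT (comment text) and CLASS (1 = spam, 0 = ham),
    # as in the public YouTube Spam Collection dataset.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    df = pd.read_csv("Youtube01-Psy.csv")              # placeholder file name
    X_train, X_test, y_train, y_test = train_test_split(
        df["CONTENT"], df["CLASS"], test_size=0.2, random_state=42)

    model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
    model.fit(X_train, y_train)
    print("Accuracy:", model.score(X_test, y_test))
    print(model.predict(["Check out my channel for free gift cards!!!"]))  # likely spam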

Name(s): PANNALA PRAVALIKA (17H61A05F7), KATUKURI NAVEEN REDDY (17H61A05E3), KANKANALA SAHITH (17H61A05E2)

Title: JOB RECOMMENDATION SYSTEM


Description:

In today's world, with an abundance of state-of-the-art industries and fields cropping up and creating a steady stream of jobs for motivated, talented professionals, identifying your field and persevering to get a job in it should not be difficult; yet a lack of information and awareness makes the task hard. Job recommendation systems tackle this problem. We present a privacy-oriented deferred multi-match recommender system that generates stable pairings while requiring users to provide only a partial ranking of their preferences. Private Job Match explores a series of adaptations of the Gale-Shapley deferred-acceptance algorithm that combine the flexibility of decentralized markets with the intelligence of centralized matching. Using the gathered real-user preference data, we find that good match recommendations can be produced while requiring only a partial ranking of user preferences.
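
Below is a minimal sketch of Gale-Shapley deferred acceptance with partial preference lists (candidates rank only the jobs they care about, jobs only the candidates they ranked); the sample data is illustrative, and Private Job Match's privacy layer and scoring are not shown:

    def deferred_acceptance(candidate_prefs, job_prefs):
        """candidate_prefs: {candidate: [jobs in preference order]} (may be partial)
           job_prefs:       {job: [candidates in preference order]} (may be partial)
           Returns a stable matching {job: candidate}."""
        rank = {j: {c: i for i, c in enumerate(prefs)} for j, prefs in job_prefs.items()}
        free = list(candidate_prefs)               # candidates with proposals left to make
        next_choice = {c: 0 for c in candidate_prefs}
        match = {}                                 # job -> currently held candidate
        while free:
            c = free.pop()
            prefs = candidate_prefs[c]
            if next_choice[c] >= len(prefs):
                continue                           # c has exhausted its partial list
            j = prefs[next_choice[c]]
            next_choice[c] += 1
            if c not in rank.get(j, {}):           # job did not rank this candidate
                free.append(c)
                continue
            if j not in match:
                match[j] = c                       # job tentatively accepts
            elif rank[j][c] < rank[j][match[j]]:   # job prefers the new proposer
                free.append(match[j])
                match[j] = c
            else:
                free.append(c)                     # proposal rejected, try next job
        return match

    candidates = {"alice": ["acme", "globex"], "bob": ["acme"]}
    jobs = {"acme": ["bob", "alice"], "globex": ["alice"]}
    print(deferred_acceptance(candidates, jobs))   # {'acme': 'bob', 'globex': 'alice'}

In this sketch the candidates propose, so the resulting stable matching is the candidate-optimal one; having jobs propose instead would favour the employer side.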

Name(s): CHINTHA SATHYANARAYANA REDDY (17H61A05C7), L TRILOCHANI (17H61A05E6), BANDAM SHIVA KIRAN (17H61A05C5)

Title: Data analysis pipeline to explore demographic information for identifying COPD patients


Description:

Chronic Obstructive Pulmonary Disease (COPD) is a life-threatening lung disease and a major cause of morbidity and mortality worldwide. Monitoring biomarkers that reflect disease progression plays a pivotal role in the effective management of COPD. Hence, the accurate examination of respiratory tract fluids such as saliva is a promising approach for staging the disease and predicting its upcoming exacerbations in a Point-of-Care (PoC) environment. Here we propose a data analysis pipeline to analyze saliva-metric and demographic data and use it to identify COPD patients. The pipeline includes two-phase data analysis, i.e., descriptive and predictive analysis. Descriptive analysis focuses on statistically describing and visualizing the dataset, and helps in selecting a prediction model that is useful for doctors.
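
Below is a minimal sketch of the two-phase pipeline: descriptive statistics first, then a simple predictive model; the file name and column names (age, fev1, copd) are placeholders for the actual saliva-metric and demographic features, and the classifier choice is an assumption:

    # Placeholder file and column names; "copd" is the binary target label.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("copd_saliva_demographics.csv")

    # Phase 1: descriptive analysis - summarise and inspect the dataset.
    print(df.describe())
    print(df.groupby("copd")[["age", "fev1"]].mean())   # per-class averages

    # Phase 2: predictive analysis - classify patients as COPD / non-COPD.
    X, y = df.drop(columns=["copd"]), df["copd"]
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)
    print(classification_report(y_test, model.predict(X_test)))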

Name(s): ISRAR AHMED KHAN (17H61A0517), VAJHA SAI SRI HARIKA KOUNDINYA (17H61A0556), N SRAVAN KUMAR (17H61A0531)

Title: Local Events Recommendation using Data Mining Technique


Description:

A local event (e.g., a protest, crime, disaster, or sports game) is an unusual activity that bursts out in a local area within a specific duration while engaging a considerable number of participants. The discovery of local events is of great importance to applications such as crime monitoring, disaster alerting, and activity recommendation. The recent explosive growth in geo-tagged tweet data brings new opportunities; the challenge is how to extract quality local events from geo-tagged tweet streams. We analyze tweets from a given geographical region to determine whether an event has occurred, using the text body, date posted, and time posted of each tweet. Every tweet we use has a geotag specifying its latitude and longitude. The model first obtains and preprocesses tweets for a particular area, then divides them into location buckets, and finally defines what counts as a significant event.
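
Below is a minimal sketch of the last two steps described above: dividing geo-tagged tweets into location buckets and flagging buckets whose tweet volume looks significant; the tweet format, grid size, baseline, and threshold rule are illustrative assumptions rather than the team's exact definition of a significant event:

    from collections import Counter

    BUCKET_DEG = 0.01   # roughly 1 km grid cells at mid latitudes (assumption)

    def bucket(lat, lon):
        """Map a latitude/longitude pair to a coarse grid cell."""
        return (round(lat / BUCKET_DEG), round(lon / BUCKET_DEG))

    def significant_buckets(tweets, baseline, factor=3.0, min_count=10):
        """Count tweets per grid cell and flag cells whose volume exceeds
        `factor` times the historical baseline for that cell."""
        counts = Counter(bucket(t["lat"], t["lon"]) for t in tweets)
        return {cell: n for cell, n in counts.items()
                if n >= min_count and n > factor * baseline.get(cell, 1)}

    tweets = [{"text": "huge crowd downtown!", "lat": 17.4401, "lon": 78.3489}
              for _ in range(25)]
    baseline = {bucket(17.4401, 78.3489): 4}        # typical volume for that cell
    print(significant_buckets(tweets, baseline))    # the cell is flagged as an event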

Name(s): KURREMULA DIVIJA (17H61A0526), TATA KIRAN (17H61A0550), SUTHARI RAHUL (17H61A0548)