liki281
thatlaconicgirl
13 posts
liki281 · 1 month ago
Text
Fuck you Karthikeya Jayarama. I hope you get hurt the way you hurt me.
0 notes
liki281 · 4 months ago
Text
If a man tunes you out, meets your words with silence, or acts like your voice is background noise, take the hint—he’s not into you, or worse, he doesn’t value you. Never waste your breath on someone who doesn’t respect you.
2 notes · View notes
liki281 · 4 months ago
Text
Journaling: A powerful habit
In this post, I would like to share the benefits of developing a daily habit like journaling. It has been a transformative experience for me so far.
Journaling is a hobby many of us first encountered during our school days, and it has featured in plenty of mainstream movies and other media. In short, journaling is not a new activity; it has been around since the beginning of recorded history, precisely when humans began writing.
During my undergraduate years, I was extremely shy and struggled with communication. My difficulty stemmed from an underlying depression that I never acknowledged or worked on. In my home country, discussing depression or mental health is often taboo, so I dealt with it in silence. This took a toll on my thinking process, making me slow to articulate my thoughts and express myself effectively. At times, I found it difficult even to form coherent sentences in conversations.
After graduating, I started my first job, where my poor communication skills affected my interactions with my manager, leading to challenges in the workplace. On top of that, I faced a difficult transition after college—friends moving away, breakups, and other personal struggles left me feeling lost. Frustrated and overwhelmed, I reached a point where I could no longer keep my emotions bottled up. That’s when I turned to writing.
At first, I started small—just one or two sentences a day. But as days passed, I found myself writing more and more. Within a month or two, journaling became a daily habit. Developing any habit is difficult at the beginning, but I told myself not to quit. Over time, writing became an outlet for my thoughts and emotions, providing me with a sense of relief and clarity.
Journaling has transformed the way I express myself. Even on tough days, putting my thoughts into words on paper helps me process my emotions and feel more in control. More importantly, it has improved my ability to communicate with others, strengthened my relationships, and encouraged a more positive outlook on life.
In short, no matter how you feel or where you are in life, journaling can help you transform in ways you never thought possible.
1 note · View note
liki281 · 4 months ago
Text
Top 20 GitHub repositories to follow today as a Software Developer
Developer Roadmap - Up-to-date roadmaps to becoming a developer.
Awesome AI Tools - A curated list of top artificial intelligence tools.
Awesome - Awesome lists about all kinds of interesting topics.
Free Programming Books - A huge list of freely available programming books.
Coding Interview University - A complete computer science study plan to become a software engineer.
JavaScript Algorithms - Algorithms and data structures implemented in JavaScript, with explanations and links.
Node Best Practices - The Node.js best practices list.
Tech Interview Handbook - Curated coding interview preparation materials for software engineers.
Project Based Learning - A curated list of project-based tutorials.
30 Seconds of Code - Short JavaScript code snippets for all your development needs.
Free for Dev - A list of SaaS, PaaS, and IaaS offerings that have free tiers.
Design Resources for Developers - A list of resources, from stock photos and web templates to frameworks, libraries, and tools.
App Ideas - A collection of application ideas that can be used to improve your coding skills.
Build your own X - Master programming by recreating your favorite technologies from scratch.
Real World - Explore how an identical Medium clone is constructed using various frontends and backends.
Public APIs - A comprehensive list of free APIs for use in software and web development.
System Design Primer - Discover how to design large-scale systems and prepare for the system design interview.
The art of command line - Master the command line, all in one page.
Awesome Repositories - A curated list of GitHub Repositories full of FREE Resources.
The Book of Secret Knowledge - A collection of inspiring lists, manuals, cheatsheets, blogs, hacks, one-liners, cli/web tools, and more.
1 note · View note
liki281 · 5 months ago
Text
What is web scraping, and which tools can you use for it?
Web scraping is the process of automatically extracting data from websites using software scripts. The technique is widely used for data mining, price monitoring, sentiment analysis, market research, and more. Popular tools include:
BeautifulSoup
Scrapy
Selenium
Requests & LXML
ScraperAPI
Octoparse
Apify
Data Miner
Web Scraper
Puppeteer
Playwright
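At its core, every tool on this list parses HTML and pulls out the pieces you care about. As a minimal sketch of that idea, here is a link extractor built on Python's standard-library html.parser (the same parser BeautifulSoup can use under the hood); the sample HTML and URLs are made up for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href attribute of every <a> tag it encounters."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A toy page; in practice you would fetch this with requests or urllib.
html = """
<html><body>
  <a href="https://example.com/page1">Page 1</a>
  <a href="https://example.com/page2">Page 2</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)
```

For anything beyond a quick script, libraries like BeautifulSoup or Scrapy handle malformed HTML, crawling, and rate limiting far more gracefully than hand-rolled parsing.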
1 note · View note
liki281 · 5 months ago
Text
YOLO vs. Embeddings: A comparison of two object detection approaches
YOLO-Based Detection

Model Type: Object detection
Method: YOLO is a single-stage object detection model that divides the image into a grid and predicts bounding boxes, class labels, and confidence scores in a single pass.
Output: Bounding boxes with class labels and confidence scores.
Use Case: Ideal for real-time applications like autonomous vehicles, surveillance, and robotics.
Example Models: YOLOv3, YOLOv4, YOLOv5, YOLOv8

Architecture
YOLO processes an image in a single forward pass of a CNN. The image is divided into a grid of cells (e.g., 13×13 for YOLOv3 at 416×416 resolution), and each cell predicts bounding boxes, class labels, and confidence scores. Anchor boxes handle different object sizes. The output is a tensor of shape [S, S, B×(5+C)], where S = grid size (e.g., 13×13), B = number of anchor boxes per grid cell, C = number of object classes, and 5 = (x, y, w, h, confidence).

Training Process
Loss Function: A combination of localization loss (bounding box regression), confidence loss, and classification loss.
Labels: Requires annotated datasets with labeled bounding boxes (e.g., COCO, Pascal VOC).
Optimization: Typically uses SGD or Adam with a backbone CNN like CSPDarknet (in YOLOv4/v5).

Inference Process
The input image is resized (e.g., to 416×416) and passed through the model in a single forward pass. Non-Maximum Suppression (NMS) then filters overlapping bounding boxes, and the model outputs the detected objects with their bounding boxes.

Strengths
Fast inference due to a single forward pass. Works well for real-time applications (e.g., autonomous driving, security cameras). Good performance on standard object detection datasets.

Weaknesses
Struggles with overlapping objects (compared to two-stage models like Faster R-CNN). A fixed number of anchor boxes may not generalize well to all object sizes. Needs retraining for new classes.

Embeddings-Based Detection

Model Type: Feature-based detection
Method: Instead of directly predicting bounding boxes, embeddings-based models generate a high-dimensional representation (embedding vector) for objects or regions in an image. These embeddings are then compared against stored embeddings to identify objects.
Output: A similarity score (e.g., cosine similarity) that determines whether an object matches a known category.
Use Case: Often used for tasks like face recognition (e.g., FaceNet, ArcFace), anomaly detection, object re-identification, and retrieval-based detection where object categories might not be predefined.
Example Models: FaceNet, DeepSORT (for object tracking), CLIP (image-text matching)

Architecture
Uses a deep feature extraction model (e.g., ResNet, EfficientNet, Vision Transformers). Instead of directly predicting bounding boxes, it generates a high-dimensional feature vector (embedding) for each object or image. The embeddings are stored in a vector database or compared using similarity metrics.

Training Process
Uses contrastive learning or metric learning. Common loss functions:
Triplet Loss: forces similar objects closer together and different objects farther apart in embedding space.
Cosine Similarity Loss: maximizes similarity between identical objects.
Center Loss: ensures class centers are well separated.
Training datasets can be either labeled (e.g., with identity labels for face recognition) or self-supervised (e.g., CLIP uses image-text pairs).

Inference Process
Extract embeddings from a new image using a CNN or transformer, then compare them with the stored vectors using cosine similarity or Euclidean distance. If the similarity is above a threshold, the object is recognized.

Strengths
Scalable: new objects can be added without retraining. Better for recognition tasks: works well for face recognition, product matching, and anomaly detection. Works without predefined classes (zero-shot learning).

Weaknesses
Requires a reference database of embeddings. Not suited to real-time object detection (it doesn't predict bounding boxes directly). Can struggle with hard negatives (objects that look similar but are different).
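The NMS step in the YOLO inference pipeline is easy to sketch on its own. Below is a minimal greedy implementation in plain Python, with made-up boxes and scores for illustration; real pipelines use vectorized versions, but the logic is the same: keep the highest-scoring box, then drop any remaining box that overlaps it too much.

```python
def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def nms(detections, iou_threshold=0.5):
    """Greedy NMS: detections is a list of (box, score) pairs."""
    detections = sorted(detections, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in detections:
        # Keep this box only if it doesn't heavily overlap a kept box.
        if all(iou(box, k[0]) < iou_threshold for k in kept):
            kept.append((box, score))
    return kept

detections = [
    ((0, 0, 10, 10), 0.90),    # two heavily overlapping detections
    ((1, 1, 10, 10), 0.80),    # of the same object
    ((20, 20, 30, 30), 0.70),  # a separate object
]
kept = nms(detections)
print(kept)  # the 0.80 box is suppressed; two boxes remain
```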
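The embeddings-based inference loop described above can be sketched in a few lines: compute cosine similarity between a query embedding and each stored reference, then accept the best match only if it clears a threshold. The reference vectors and labels here are toy values invented for illustration, not output from a real model:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical reference "database" of stored embeddings.
reference = {
    "cat": [0.9, 0.1, 0.0],
    "dog": [0.1, 0.9, 0.2],
}

def recognize(query, reference, threshold=0.8):
    """Return the best-matching label, or 'unknown' below the threshold."""
    best_label, best_score = None, -1.0
    for label, embedding in reference.items():
        score = cosine_similarity(query, embedding)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else "unknown"

print(recognize([0.85, 0.15, 0.05], reference))  # close to "cat"
print(recognize([0.0, 0.0, 1.0], reference))     # matches nothing well
```

Note how adding a new class is just another entry in the reference dict, with no retraining; that is exactly the scalability advantage described above.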
1 note · View note
liki281 · 5 months ago
Text
Top journals for data science and AI
1. Journal of Data Science
2. IEEE journals
3. MIT Technology Review
4. Harvard Data Science Review
5. Data Science Journal
1 note · View note
liki281 · 8 months ago
Text
Things I learnt as a young professional 👩‍💻
1. Always meet deadlines.
2. If you are unable to take up a task, tell your manager.
3. The better you perform, the higher your superiors' expectations will be.
4. Learn the art of dealing with people tactfully, including how to convince them.
5. Never lose your self-esteem after a bad day.
6. Always reach out to experienced people when you don't know the answer.
7. Speaking up about your problems and opinions makes your work easier.
8. Don't put the blame on your peers.
Thank you and happy working!!
4 notes · View notes
liki281 · 3 years ago
Text
Tumblr media
1 note · View note
liki281 · 3 years ago
Text
Tumblr media
6 notes · View notes
liki281 · 3 years ago
Text
Tumblr media
1 note · View note
liki281 · 3 years ago
Text
Tumblr media
2 notes · View notes
liki281 · 3 years ago
Text
Tumblr media
2 notes · View notes