#python-data-manipulation
Text
Cleaning Dirty Data in Python: Practical Techniques with Pandas
I. Introduction
Hey there! So, let’s talk about a really important step in data analysis: data cleaning. It’s basically like tidying up your room before a big party – you want everything to be neat and organized so you can find what you need, right? Now, when it comes to sorting through a bunch of messy data, you’ll be glad to have a tool like Pandas by your side. It’s like the superhero of…
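Before you click through, here's a minimal sketch of the kind of cleanup the full post walks through — the DataFrame and its values are made up for illustration:

import pandas as pd
import numpy as np

# a toy table with the usual messes: inconsistent text, missing values, duplicates
df = pd.DataFrame({
    "name": ["Ada", "ada ", "Grace", None],
    "score": [90, 90, np.nan, 75],
})

df["name"] = df["name"].str.strip().str.title()       # normalize inconsistent text
df["score"] = df["score"].fillna(df["score"].mean())  # fill missing values
df = df.drop_duplicates()                             # drop exact duplicate rows
print(df)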

View On WordPress
#categorical-data#data-cleaning#data-duplicates#data-outliers#inconsistent-data#missing-values#pandas-tutorial#python-data-cleaning-tools#python-data-manipulation#python-pandas#text-cleaning
0 notes
Text


The difference between replacing FA with 20 in Python (left, with Pillow and binascii) and in a hex editor (right, HxD).
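For anyone who wants to reproduce the contrast, a rough sketch of both routes — filenames are placeholders, and I've used a plain bytes replace where the original used binascii:

from PIL import Image

# route 1: Pillow decodes the file first, so you're editing pixel values
img = Image.open("input.jpg").convert("RGB")
pixels = img.load()
for y in range(img.height):
    for x in range(img.width):
        r, g, b = pixels[x, y]
        pixels[x, y] = (0x20 if r == 0xFA else r, g, b)
img.save("pixels_glitched.jpg")

# route 2: raw bytes, like a hex editor - this corrupts the encoded stream itself
with open("input.jpg", "rb") as f:
    data = f.read()
with open("bytes_glitched.jpg", "wb") as f:
    f.write(data.replace(b"\xFA", b"\x20"))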
#python only manipulates the pixels whereas hex editors fuck the actual image data#further experiments in glitch art
1 note
·
View note
Text
Day-4: Unlocking the Power of Randomization in Python Lists
Python Boot Camp 2023 - Day-4
Randomization and Python List
Introduction
Randomization is an essential concept in computer programming and data analysis. It is the process of generating random elements or sequences in which each possibility has an equal chance of being selected. In Python, randomization is a powerful tool that allows developers to introduce an element of unpredictability and make programs more dynamic. This article…
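A quick sketch of the kind of list randomization the article goes on to cover, using Python's built-in random module:

import random

cards = ["ace", "king", "queen", "jack"]

print(random.choice(cards))     # pick one element at random
print(random.sample(cards, 2))  # pick two distinct elements
random.shuffle(cards)           # shuffle the list in place
print(cards)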

View On WordPress
#Advantages of Randomization in Programming#Dynamic Python Applications#Enhancing User Experience with Randomization#Generating Random Data in Python#How to Shuffle Lists in Python#Python List Data Structure#Python List Manipulation#Python programming techniques#Random Element Selection in Python#Randomization in Python Lists#Randomized Algorithms in Python#Secure Outcomes with Randomization#Unbiased Outcomes in Python#Understanding Non-Deterministic Behavior#Versatility of Randomization in Python
0 notes
Note
Wait I fucked up, I'ma try again:
Since you know python what's the index for the first 3 items of a list?
Is this supposed to "test" me on zero indexing vs one indexing languages?
All you need to know is that I use both Python and R, and R makes me want to punch drywall, in part because its 1-indexing causes SO many off-by-one errors if you try to use it for anything more complicated than parroting the same fucking statistical test and graph generation commands over and over
If you need serious data manipulation, pandas is right fucking there.
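(For the record, the literal answer in Python — indices 0 through 2, or just a slice:)

items = ["a", "b", "c", "d", "e"]

print(items[0:3])  # indices 0, 1, 2 - the end index is exclusive -> ['a', 'b', 'c']
print(items[:3])   # same thing; the starting 0 is optional
# the R equivalent is items[1:3]: 1-indexed, with both endpoints inclusive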
37 notes
·
View notes
Text
Hmm, Pyodide is not letting me manipulate the DOM through Python, and it's throwing no errors either. I'm trying the latest Pyodide, but that's not doing anything yet. I'll have to dig further. Worst case I can hand the data over to JS with Python globals, but I'd like to keep trying to get this working for a day or so.
For reference even this example is failing:
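(The embedded snippet didn't survive here; below is my guess at the kind of minimal DOM example meant — roughly what the Pyodide docs show, not necessarily the exact code from the post:)

# runs inside Pyodide in the browser - plain CPython has no `js` module
from js import document

div = document.createElement("div")
div.innerHTML = "<h1>Rendered from Python</h1>"
document.body.appendChild(div)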
9 notes
·
View notes
Text
Tools of the Trade for Learning Cybersecurity
I created this post for the Studyblr Masterpost Jam, check out the tag for more cool masterposts from folks in the studyblr community!
Cybersecurity professionals use a lot of different tools to get the job done. There are plenty of fancy and expensive tools that enterprise security teams use, but luckily there are also lots of brilliant people writing free and open-source software. In this post, I'm going to list some popular free tools that you can download right now to practice and learn with.
In my opinion, one of the most important tools you can learn how to use is a virtual machine. If you're not already familiar with Linux, this is a great way to learn. VMs are helpful for separating all your security tools from your everyday OS, isolating potentially malicious files, and just generally experimenting. You'll need to use something like VirtualBox or VMware Workstation (Workstation Pro is now free for personal use, but they make you jump through hoops to download it).
Below is a list of some popular cybersecurity-focused Linux distributions that come with lots of tools pre-installed:
Kali is a popular distro that comes loaded with tools for penetration testing
REMnux is a distro built for malware analysis
honorable mention for FLARE-VM, which is not a VM on its own, but a set of scripts for setting up a malware analysis workstation & installing tools on a Windows VM.
SANS maintains several different distros that are used in their courses. You'll need to create an account to download them, but they're all free:
Slingshot is built for penetration testing
SIFT Workstation is a distro that comes with lots of tools for digital forensics
These distros can be kind of overwhelming if you don't know how to use most of the pre-installed software yet, so just starting with a regular Linux distribution and installing tools as you want to learn them is another good choice for learning.
Free Software
Wireshark: sniff packets and explore network protocols
Ghidra and the free version of IDA Pro are the top picks for reverse engineering
for digital forensics, check out Eric Zimmerman's tools - there are many different ones for exploring & analyzing different forensic artifacts
pwntools is a super useful Python library for solving binary exploitation CTF challenges (see the short sketch after this list)
CyberChef is a tool that makes it easy to manipulate data - encryption & decryption, encoding & decoding, formatting, conversions… CyberChef gives you a lot to work with (and there's a web version - no installation required!).
Burp Suite is a handy tool for web security testing that has a free community edition
Metasploit is a popular penetration testing framework, check out Metasploitable if you want a target to practice with
SANS also has a list of free tools that's worth checking out.
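Since pwntools is the one Python library mentioned above, here's a taste of what it looks like in use — a sketch only, with a placeholder binary name and a made-up overflow offset:

from pwn import *  # pwntools' conventional wildcard import

p = process("./challenge")             # run a local target binary (placeholder)
p.recvuntil(b"> ")                     # read output up to the prompt
payload = b"A" * 64 + p64(0xdeadbeef)  # classic overflow: padding + packed address
p.sendline(payload)
p.interactive()                        # hand the session over to you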
Programming Languages
Knowing how to write code isn't a hard requirement for learning cybersecurity, but it's incredibly useful. Any programming language will do, especially since learning one will make it easy to pick up others, but these are some common ones that security folks use:
Python is quick to write, easy to learn, and since it's so popular, there are lots of helpful libraries out there.
PowerShell is useful for automating things in the Windows world. It's built on .NET, so you can practically dip into writing C# if you need a bit more power.
Go is a relatively new language, but it's popular and there are some security tools written in it.
Rust is another new-ish language that's designed for memory safety and it has a wonderful community. There's a bit of a steep learning curve, but learning Rust makes you understand how memory bugs work and I think that's neat.
If you want to get into reverse engineering or malware analysis, you'll want to have a good grasp of C and C++.
Other Tools for Cybersecurity
There are lots of things you'll need that aren't specific to cybersecurity, like:
a good system for taking notes, whether that's pen & paper or software-based. I recommend using something that lets you work in plain text or close to it.
general command line familiarity + basic knowledge of CLI text editors (nano is great, but what if you have to work with a system that only has vi?)
familiarity with git and docker will be helpful
There are countless scripts and programs out there, but the most important thing is understanding what your tools do and how they work. There is no magic "hack this system" or "solve this forensics case" button. Tools are great for speeding up the process, but you have to know what the process is. Definitely take some time to learn how to use them, but don't base your entire understanding of security on code that someone else wrote. That's how you end up as a "script kiddie", and your skills and knowledge will be limited.
Feel free to send me an ask if you have questions about any specific tool or something you found that I haven't listed. I have approximate knowledge of many things, and if I don't have an answer I can at least help point you in the right direction.
#studyblrmasterpostjam#studyblr#masterpost#cybersecurity#late post bc I was busy yesterday oops lol#also this post is nearly a thousand words#apparently I am incapable of being succinct lmao
22 notes
·
View notes
Text
Learning to code and becoming a data scientist without a background in computer science or mathematics is absolutely possible, but it will require dedication, time, and a structured approach. ✨👌🏻 🖐🏻Here’s a step-by-step guide to help you get started:
1. Start with the Basics:
- Begin by learning the fundamentals of programming. Choose a beginner-friendly programming language like Python, which is widely used in data science.
- Online platforms like Codecademy, Coursera, and Khan Academy offer interactive courses for beginners.
2. Learn Mathematics and Statistics:
- While you don’t need to be a mathematician, a solid understanding of key concepts like algebra, calculus, and statistics is crucial for data science.
- Platforms like Khan Academy and MIT OpenCourseWare provide free resources for learning math.
3. Online Courses and Tutorials:
- Enroll in online data science courses on platforms like Coursera, edX, Udacity, and DataCamp. Look for beginner-level courses that cover data analysis, visualization, and machine learning.
4. Structured Learning Paths:
- Follow structured learning paths offered by online platforms. These paths guide you through various topics in a logical sequence.
5. Practice with Real Data:
- Work on hands-on projects using real-world data. Websites like Kaggle offer datasets and competitions for practicing data analysis and machine learning.
6. Coding Exercises:
- Practice coding regularly to build your skills. Sites like LeetCode and HackerRank offer coding challenges that can help improve your programming proficiency.
7. Learn Data Manipulation and Analysis Libraries:
- Familiarize yourself with Python libraries like NumPy, pandas, and Matplotlib for data manipulation, analysis, and visualization.
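If step 7 feels abstract, a first session with those three libraries can be as small as this (a sketch with made-up numbers):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

scores = np.array([72, 85, 90, 64, 78])                         # NumPy: fast numeric arrays
df = pd.DataFrame({"student": list("ABCDE"), "score": scores})  # pandas: a labeled table
print(df.describe())                                            # quick summary statistics

df.plot.bar(x="student", y="score")  # Matplotlib (via pandas) for a quick chart
plt.show()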
For more follow me on instagram.
#studyblr#100 days of productivity#stem academia#women in stem#study space#study motivation#dark academia#classic academia#academic validation#academia#academics#dark acadamia aesthetic#grey academia#light academia#romantic academia#chaotic academia#post grad life#grad student#graduate school#grad school#gradblr#stemblog#stem#stemblr#stem student#engineering college#engineering student#engineering#student life#study
7 notes
·
View notes
Text
Python Libraries to Learn Before Tackling Data Analysis
To tackle data analysis effectively in Python, it's crucial to become familiar with several libraries that streamline the process of data manipulation, exploration, and visualization. Here's a breakdown of the essential libraries:
1. NumPy
- Purpose: Numerical computing.
- Why Learn It: NumPy provides support for large multi-dimensional arrays and matrices, along with a collection of mathematical functions to operate on these arrays efficiently.
- Key Features:
- Fast array processing.
- Mathematical operations on arrays (e.g., sum, mean, standard deviation).
- Linear algebra operations.
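A minimal sketch of those features in action:

import numpy as np

a = np.array([[1.0, 2.0], [3.0, 4.0]])  # a 2-D array (matrix)
print(a.sum(), a.mean(), a.std())       # fast mathematical operations on arrays
print(a @ np.linalg.inv(a))             # linear algebra: A times its inverse ~ identity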
2. Pandas
- Purpose: Data manipulation and analysis.
- Why Learn It: Pandas offers data structures like DataFrames, making it easier to handle and analyze structured data.
- Key Features:
- Reading/writing data from CSV, Excel, SQL databases, and more.
- Handling missing data.
- Powerful group-by operations.
- Data filtering and transformation.
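A small sketch touching each of those features (the data is made up):

import pandas as pd

df = pd.DataFrame({"city": ["Oslo", "Oslo", "Lima"], "temp": [3.0, None, 22.0]})
df["temp"] = df["temp"].fillna(df["temp"].mean())  # handling missing data
print(df[df["temp"] > 10])                         # filtering
print(df.groupby("city")["temp"].mean())           # group-by operations
df.to_csv("temps.csv", index=False)                # writing data to CSV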
3. Matplotlib
- Purpose: Data visualization.
- Why Learn It: Matplotlib is one of the most widely used plotting libraries in Python, allowing for a wide range of static, animated, and interactive plots.
- Key Features:
- Line plots, bar charts, histograms, scatter plots.
- Customizable charts (labels, colors, legends).
- Integration with Pandas for quick plotting.
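A sketch of the basics:

import matplotlib.pyplot as plt

x = [1, 2, 3, 4]
y = [1, 4, 9, 16]
plt.plot(x, y, color="tab:blue", label="y = x^2")  # a simple line plot
plt.xlabel("x")                                    # customizable labels...
plt.ylabel("y")
plt.legend()                                       # ...and legends
plt.show()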
4. Seaborn
- Purpose: Statistical data visualization.
- Why Learn It: Built on top of Matplotlib, Seaborn simplifies the creation of attractive and informative statistical graphics.
- Key Features:
- High-level interface for drawing attractive statistical graphics.
- Easier to use for complex visualizations like heatmaps, pair plots, etc.
- Visualizations based on categorical data.
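For example, a full statistical plot in one call (the tips dataset ships with Seaborn and downloads on first use):

import seaborn as sns
import matplotlib.pyplot as plt

tips = sns.load_dataset("tips")                  # a small sample dataset
sns.boxplot(data=tips, x="day", y="total_bill")  # categorical visualization in one line
plt.show()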
5. SciPy
- Purpose: Scientific and technical computing.
- Why Learn It: SciPy builds on NumPy and provides additional functionality for complex mathematical operations and scientific computing.
- Key Features:
- Optimized algorithms for numerical integration, optimization, and more.
- Statistics, signal processing, and linear algebra modules.
6. Scikit-learn
- Purpose: Machine learning and statistical modeling.
- Why Learn It: Scikit-learn provides simple and efficient tools for data mining, analysis, and machine learning.
- Key Features:
- Classification, regression, and clustering algorithms.
- Dimensionality reduction, model selection, and preprocessing utilities.
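A sketch of that classify-and-evaluate workflow on scikit-learn's bundled iris dataset:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier().fit(X_train, y_train)  # classification
print(model.score(X_test, y_test))                      # model evaluation (accuracy)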
7. Statsmodels
- Purpose: Statistical analysis.
- Why Learn It: Statsmodels allows users to explore data, estimate statistical models, and perform tests.
- Key Features:
- Linear regression, logistic regression, time series analysis.
- Statistical tests and models for descriptive statistics.
8. Plotly
- Purpose: Interactive data visualization.
- Why Learn It: Plotly allows for the creation of interactive and web-based visualizations, making it ideal for dashboards and presentations.
- Key Features:
- Interactive plots like scatter, line, bar, and 3D plots.
- Easy integration with web frameworks.
- Dashboards and web applications with Dash.
9. TensorFlow/PyTorch (Optional)
- Purpose: Machine learning and deep learning.
- Why Learn It: If your data analysis involves machine learning, these libraries will help in building, training, and deploying deep learning models.
- Key Features:
- Tensor processing and automatic differentiation.
- Building neural networks.
10. Dask (Optional)
- Purpose: Parallel computing for data analysis.
- Why Learn It: Dask enables scalable data manipulation by parallelizing Pandas operations, making it ideal for big datasets.
- Key Features:
- Works with NumPy, Pandas, and Scikit-learn.
- Handles large data and parallel computations easily.
Focusing on NumPy, Pandas, Matplotlib, and Seaborn will set a strong foundation for basic data analysis.
6 notes
·
View notes
Note
Any good python modules I can learn now that I'm familiar with the basics?
Hiya 💗
Yep, here's a bunch you can import into your program to play around with!
math: Provides mathematical functions and constants.
random: Enables generation of random numbers, choices, and shuffling.
datetime: Offers classes for working with dates and times.
os: Allows interaction with the operating system, such as file and directory manipulation.
sys: Provides access to system-specific parameters and functions.
json: Enables working with JSON (JavaScript Object Notation) data.
csv: Simplifies reading and writing CSV (Comma-Separated Values) files.
re: Provides regular expression matching operations.
requests: Allows making HTTP requests to interact with web servers.
matplotlib: A popular plotting library for creating visualizations.
numpy: Enables numerical computations and working with arrays.
pandas: Provides data structures and analysis tools for data manipulation.
turtle: Allows creating graphics and simple games using turtle graphics.
time: Offers functions for time-related operations.
argparse: Simplifies creating command-line interfaces with argument parsing.
How to actually import a module into your program?
Just in case you don't know, or those reading who don't know:
Use the 'import' keyword, preferably at the top of the file, followed by the name of the module you want to import. OPTIONAL: you can add 'as [shortname you want to use in your program]' at the end to use the shortname instead of the whole module name
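So, concretely (a tiny sketch using modules from the list above):

import math                # plain import: use the full module name
import numpy as np         # import with a shortname (alias)
from random import choice  # import just one name from a module

print(math.sqrt(16))       # -> 4.0
print(np.mean([1, 2, 3]))  # -> 2.0
print(choice(["red", "green", "blue"]))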
Hope this helps, good luck with your Python programming! 🙌🏾
60 notes
·
View notes
Text
DataFrame in Pandas: Guide to Creating Awesome DataFrames
Explore how to create a dataframe in Pandas, including data input methods, customization options, and practical examples.
Data analysis used to be a daunting task, reserved for statisticians and mathematicians. But with the rise of powerful tools like Python and its fantastic library, Pandas, anyone can become a data whiz! Pandas, in particular, shines with its DataFrames, these nifty tables that organize and manipulate data like magic. But where do you start? Fear not, fellow data enthusiast, for this guide will…
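The two quickest starting points the full guide expands on (a sketch with made-up data):

import pandas as pd

# from a dictionary: keys become column names
df1 = pd.DataFrame({"name": ["Ada", "Grace"], "born": [1815, 1906]})

# from a list of lists: name the columns yourself
df2 = pd.DataFrame([["Ada", 1815], ["Grace", 1906]], columns=["name", "born"])

print(df1.equals(df2))  # True - same table either way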

View On WordPress
#advanced dataframe features#aggregating data in pandas#create dataframe from dictionary in pandas#create dataframe from list in pandas#create dataframe in pandas#data manipulation in pandas#dataframe indexing#filter dataframe by condition#filter dataframe by multiple conditions#filtering data in pandas#grouping data in pandas#how to make a dataframe in pandas#manipulating data in pandas#merging dataframes#pandas data structures#pandas dataframe tutorial#python dataframe basics#rename columns in pandas dataframe#replace values in pandas dataframe#select columns in pandas dataframe#select rows in pandas dataframe#set column names in pandas dataframe#set row names in pandas dataframe
0 notes
Text
Why Python Will Thrive: Future Trends and Applications
Python has already made a significant impact in the tech world, and its trajectory for the future is even more promising. From its simplicity and versatility to its widespread use in cutting-edge technologies, Python is expected to continue thriving in the coming years. With the kind support of a Python Course in Chennai, learning Python gets much more fun, whatever your level of experience or your reason for switching from another programming language.
Let's explore why Python will remain at the forefront of software development and what trends and applications will contribute to its ongoing dominance.
1. Artificial Intelligence and Machine Learning
Python is already the go-to language for AI and machine learning, and its role in these fields is set to expand further. With powerful libraries such as TensorFlow, PyTorch, and Scikit-learn, Python simplifies the development of machine learning models and artificial intelligence applications. As more industries integrate AI for automation, personalization, and predictive analytics, Python will remain a core language for developing intelligent systems.
2. Data Science and Big Data
Data science is one of the most significant areas where Python has excelled. Libraries like Pandas, NumPy, and Matplotlib make data manipulation and visualization simple and efficient. As companies and organizations continue to generate and analyze vast amounts of data, Python’s ability to process, clean, and visualize big data will only become more critical. Additionally, Python’s compatibility with big data platforms like Hadoop and Apache Spark ensures that it will remain a major player in data-driven decision-making.
3. Web Development
Python’s role in web development is growing thanks to frameworks like Django and Flask, which provide robust, scalable, and secure solutions for building web applications. With the increasing demand for interactive websites and APIs, Python is well-positioned to continue serving as a top language for backend development. Its integration with cloud computing platforms will also fuel its growth in building modern web applications that scale efficiently.
4. Automation and Scripting
Automation is another area where Python excels. Developers use Python to automate tasks ranging from system administration to testing and deployment. With the rise of DevOps practices and the growing demand for workflow automation, Python’s role in streamlining repetitive processes will continue to grow. Businesses across industries will rely on Python to boost productivity, reduce errors, and optimize performance. With the aid of Best Online Training & Placement Programs, which offer comprehensive training and job placement support to anyone looking to develop their talents, it’s easier to learn this tool and advance your career.
5. Cybersecurity and Ethical Hacking
With cyber threats becoming increasingly sophisticated, cybersecurity is a critical concern for businesses worldwide. Python is widely used for penetration testing, vulnerability scanning, and threat detection due to its simplicity and effectiveness. Libraries like Scapy and PyCrypto make Python an excellent choice for ethical hacking and security professionals. As the need for robust cybersecurity measures increases, Python’s role in safeguarding digital assets will continue to thrive.
6. Internet of Things (IoT)
Python’s compatibility with microcontrollers and embedded systems makes it a strong contender in the growing field of IoT. Frameworks like MicroPython and CircuitPython enable developers to build IoT applications efficiently, whether for home automation, smart cities, or industrial systems. As the number of connected devices continues to rise, Python will remain a dominant language for creating scalable and reliable IoT solutions.
7. Cloud Computing and Serverless Architectures
The rise of cloud computing and serverless architectures has created new opportunities for Python. Cloud platforms like AWS, Google Cloud, and Microsoft Azure all support Python, allowing developers to build scalable and cost-efficient applications. With its flexibility and integration capabilities, Python is perfectly suited for developing cloud-based applications, serverless functions, and microservices.
8. Gaming and Virtual Reality
Python has long been used in game development, with libraries such as Pygame offering simple tools to create 2D games. However, as gaming and virtual reality (VR) technologies evolve, Python’s role in developing immersive experiences will grow. The language’s ease of use and integration with game engines will make it a popular choice for building gaming platforms, VR applications, and simulations.
9. Expanding Job Market
As Python’s applications continue to grow, so does the demand for Python developers. From startups to tech giants like Google, Facebook, and Amazon, companies across industries are seeking professionals who are proficient in Python. The increasing adoption of Python in various fields, including data science, AI, cybersecurity, and cloud computing, ensures a thriving job market for Python developers in the future.
10. Constant Evolution and Community Support
Python’s open-source nature means that it’s constantly evolving with new libraries, frameworks, and features. Its vibrant community of developers contributes to its growth and ensures that Python stays relevant to emerging trends and technologies. Whether it’s a new tool for AI or a breakthrough in web development, Python’s community is always working to improve the language and make it more efficient for developers.
Conclusion
Python’s future is bright, with its presence continuing to grow in AI, data science, automation, web development, and beyond. As industries become increasingly data-driven, automated, and connected, Python’s simplicity, versatility, and strong community support make it an ideal choice for developers. Whether you are a beginner looking to start your coding journey or a seasoned professional exploring new career opportunities, learning Python offers long-term benefits in a rapidly evolving tech landscape.
#python course#python training#python#technology#tech#python programming#python online training#python online course#python online classes#python certification
2 notes
·
View notes
Text
Day-1: Demystifying Python Variables: A Comprehensive Guide for Data Management
Python Boot Camp Series 2023.
Python is a powerful and versatile programming language used for a wide range of applications. One of the fundamental concepts in Python, and in programming in general, is working with variables. In this article, we will explore what variables are, how to use them effectively to manage data, and some best practices for their usage.
What are Variables in Python?
Definition of Variables
In…
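A taste of the basics the article builds on (a tiny sketch):

x = 42          # a variable holding an int
x = "hello"     # dynamic typing: the same name can rebind to a str
user_count = 3  # descriptive lowercase_with_underscores names are the convention
print(x, user_count)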

View On WordPress
#best practices for variables#data management in Python#dynamic typing#Python beginners guide#Python coding tips#Python data manipulation#Python data types#Python programming#Python programming concepts#Python tutorials#Python variable naming rules#Python variables#variable scope#working with variables
0 notes
Text
Unlocking the Power of Data: Essential Skills to Become a Data Scientist
In today's data-driven world, the demand for skilled data scientists is skyrocketing. These professionals are the key to transforming raw information into actionable insights, driving innovation and shaping business strategies. But what exactly does it take to become a data scientist? It's a multidisciplinary field, requiring a unique blend of technical prowess and analytical thinking. Let's break down the essential skills you'll need to embark on this exciting career path.
1. Strong Mathematical and Statistical Foundation:
At the heart of data science lies a deep understanding of mathematics and statistics. You'll need to grasp concepts like:
Linear Algebra and Calculus: Essential for understanding machine learning algorithms and optimizing models.
Probability and Statistics: Crucial for data analysis, hypothesis testing, and drawing meaningful conclusions from data.
2. Programming Proficiency (Python and/or R):
Data scientists are fluent in at least one, if not both, of the dominant programming languages in the field:
Python: Known for its readability and extensive libraries like Pandas, NumPy, Scikit-learn, and TensorFlow, making it ideal for data manipulation, analysis, and machine learning.
R: Specifically designed for statistical computing and graphics, R offers a rich ecosystem of packages for statistical modeling and visualization.
3. Data Wrangling and Preprocessing Skills:
Raw data is rarely clean and ready for analysis. A significant portion of a data scientist's time is spent on:
Data Cleaning: Handling missing values, outliers, and inconsistencies.
Data Transformation: Reshaping, merging, and aggregating data.
Feature Engineering: Creating new features from existing data to improve model performance.
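A compressed sketch of those three steps in pandas (the data is invented):

import pandas as pd

df = pd.DataFrame({"height_m": [1.7, 1.6], "weight_kg": [70, None]})
df["weight_kg"] = df["weight_kg"].fillna(df["weight_kg"].median())  # cleaning
df["bmi"] = df["weight_kg"] / df["height_m"] ** 2                   # feature engineering
print(df)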
4. Expertise in Databases and SQL:
Data often resides in databases. Proficiency in SQL (Structured Query Language) is essential for:
Extracting Data: Querying and retrieving data from various database systems.
Data Manipulation: Filtering, joining, and aggregating data within databases.
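Staying in Python, the built-in sqlite3 module is an easy way to practice both of those (the table and numbers are invented):

import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("north", 120.0), ("south", 80.0), ("north", 200.0)])

# extracting and aggregating data with SQL
for row in conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region"):
    print(row)  # e.g. ('north', 320.0)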
5. Machine Learning Mastery:
Machine learning is a core component of data science, enabling you to build models that learn from data and make predictions or classifications. Key areas include:
Supervised Learning: Regression, classification algorithms.
Unsupervised Learning: Clustering, dimensionality reduction.
Model Selection and Evaluation: Choosing the right algorithms and assessing their performance.
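A sketch of that supervised/unsupervised split with scikit-learn, on synthetic data:

from sklearn.cluster import KMeans                   # unsupervised: clustering
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression  # supervised: classification

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

clf = LogisticRegression().fit(X, y)  # learns from labeled examples
print(clf.score(X, y))                # model evaluation: mean accuracy

km = KMeans(n_clusters=2, n_init=10).fit(X)  # finds structure without labels
print(km.labels_[:10])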
6. Data Visualization and Communication Skills:
Being able to effectively communicate your findings is just as important as the analysis itself. You'll need to:
Visualize Data: Create compelling charts and graphs to explore patterns and insights using libraries like Matplotlib, Seaborn (Python), or ggplot2 (R).
Tell Data Stories: Present your findings in a clear and concise manner that resonates with both technical and non-technical audiences.
7. Critical Thinking and Problem-Solving Abilities:
Data scientists are essentially problem solvers. You need to be able to:
Define Business Problems: Translate business challenges into data science questions.
Develop Analytical Frameworks: Structure your approach to solve complex problems.
Interpret Results: Draw meaningful conclusions and translate them into actionable recommendations.
8. Domain Knowledge (Optional but Highly Beneficial):
Having expertise in the specific industry or domain you're working in can give you a significant advantage. It helps you understand the context of the data and formulate more relevant questions.
9. Curiosity and a Growth Mindset:
The field of data science is constantly evolving. A genuine curiosity and a willingness to learn new technologies and techniques are crucial for long-term success.
10. Strong Communication and Collaboration Skills:
Data scientists often work in teams and need to collaborate effectively with engineers, business stakeholders, and other experts.
Kickstart Your Data Science Journey with Xaltius Academy's Data Science and AI Program:
Acquiring these skills can seem like a daunting task, but structured learning programs can provide a clear and effective path. Xaltius Academy's Data Science and AI Program is designed to equip you with the essential knowledge and practical experience to become a successful data scientist.
Key benefits of the program:
Comprehensive Curriculum: Covers all the core skills mentioned above, from foundational mathematics to advanced machine learning techniques.
Hands-on Projects: Provides practical experience working with real-world datasets and building a strong portfolio.
Expert Instructors: Learn from industry professionals with years of experience in data science and AI.
Career Support: Offers guidance and resources to help you launch your data science career.
Becoming a data scientist is a rewarding journey that blends technical expertise with analytical thinking. By focusing on developing these key skills and leveraging resources like Xaltius Academy's program, you can position yourself for a successful and impactful career in this in-demand field. The power of data is waiting to be unlocked – are you ready to take the challenge?
2 notes
·
View notes
Text
https://www.bipamerica.org/data-scientists-toolkit-top-python-libraries
A Data Scientist's toolkit heavily relies on Python libraries to handle data processing, analysis, and modeling. NumPy is essential for numerical computations and array operations, while Pandas provides powerful tools for data manipulation and analysis. Matplotlib and Seaborn are key for data visualization, enabling the creation of insightful charts and graphs.
5 notes
·
View notes
Text
Fuckin
Semi masturbatory caffeinated ramble reflecting on skills acquired in my PhD
Thinking about how broad and interdisciplinary my project is and the kinds of things I have to be familiar with or an expert in. I get down on myself sometimes for progress but looking at all the shit I've learned.... Without formal classes or a senior grad student or (for the majority of it) a post doc. And a PI who can't help bc she's really a business lady at this point not a professor. Maybe I shouldn't be so hard on myself?
Like. I did completely different projects in undergrad (biotech/proteins/genetics/regenerative medicine, advanced manufacturing/composite fabrication/CNC/welding/process statistics, translational neuropharma studies of addiction/rodent handling and operant training and behavioral video analysis/neural tissue slicing n staining/hand making neuroelectrodes for implantation, design and fabrication of impedance spectroscopy based electrochemical sensors/automation of sensor fab and use w a micro fluidic flow cell)
Like. Since I've started I've learned:
- how to do multi-step air-free water-free chemical synthesis (with glove box and schlenk line) and purification (extraction, filtration, chromatography) of light sensitive amphiphiles (extra tricky)
- how to get and read NMR even for massive fucking molecules and interpret weird peaks (I can casually see if I've got water or any of my common solvents contaminating the spectra without referencing a table at this point)
-how to fucking take down and set up and fix everything in our chemical synthesis lab (because we moved and all our shit was abused for years) and all the intricate non-unified and sometimes conflicting rules for hazardous chemical storage
- the theory/math and how to actually use the equipment to do optoelectronic/photophysical characterization (e.g. using the UV vis spectrometer and writing python to convert the data files into readable tables and figures, learning theory so I can develop equations to relate photon flux to change in absorbance of an actinometer ((light sensitive molecule with a consistent quantum yield)) then obtain quantum yield of charge transfer in a different molecule but same setup, how to use the fluorimeter and get intensity and quantum yield, how to set up lasers and LEDs, what cuvettes to use, how to get fluorescence lifetimes or take two photon excitation data, how to spin coat wafers n do thin film transistor studies), more theory about how photo induced electron transfer voltage sensors work and the importance of angle of insertion on sensitivity (and how to measure it with polarization microscopy) other voltage sensing dye mechanisms like FRET or electrochromic dyes and why to use intensity vs lifetime vs whatever when interpreting signal readouts and the extrinsic and intrinsic factors affecting that interpretation.
- how to do vesicle fabrication and immobilize for imaging, typical membrane compositions and dynamics (e.g. phase orders depending on cholesterol concentrations, significance of packing parameters to membrane organization), measuring particle radius with DLS, controlling inner cargo and gradients over a membrane by manipulating the bulk solution, the interplay between non radiative decay and the stiffness of the membrane microenvironment around a fluorophore
- the math and bio behind electrophysiology/advanced neuroscience pertaining to modeling and calculating and quantifying signalling/equivalent RC circuit analysis, what spatiotemporal requirements there are for studying this shit <- though this was through a class, not self taught
- I already had cell culture experience and did some adherent and suspended cultures, some live dead imaging assays, etc, but I've learned new facets like how to go about picking electrically excitable lines (ease of growing? What media requirements? time to multiply and differentiate? What agent to differentiate? How to induce firing without a patch clamp?) and troubleshooting uptake/optimizing staining and imaging parameters (what media or buffer for growth vs staining vs washing vs imaging? Can it have serum? Can it have calcium and magnesium? What salts, how is it buffered, what's the osmolarity I can get away with? What concentrations work for what # of cells? What dilution factors? Do I need to admix equivolumes of dye solution and cell solution? Do I prep the organic solvent + dye + aqueous solution with sonication or filtration or vortexing before mixing? Is DMSO or ethanol or DMF a better organic for dispersal or biocompatibility? What's the ideal incubation time for uptake and viability? How long before I absolutely need to image or the dye gets internalized? If it's retained long, how many days could I image for?) for my tricky aggregation-prone non-diffusive thermodynamically-partitioned dye. Also stuff like what commercially available live imaging dyes can I compare to or complement my visualizations with or use for colocalization studies (other lipophilic membrane dyes that insert in the bilayer with preferences for diff order regions? What about comparison with surface adhering dyes like WGA-iFluor that bind surface sugars, to show that our dye can laterally diffuse to areas blocked by cell-cell contacts?), what fluorescence specific parameters do I need to characterize (phototoxicity/photobleaching time?)
And then there's other shit I've picked up like. Idk. How to place orders in the particular institute I'm in. Better citation managers and ways to search literature. Recognizing what groups and journals and conferences are major players in the fields I'm touching. Getting comfy presenting my shit.
I need to learn a little more about microscopy (especially FLIM and how to build a polarizer module into the scope we have for polarization microscopy), and a little more about the state of the art for voltage dyes and live-imaging dye characterization but man. I think I'm getting somewhere. I'm starting to know enough to see the end of this project and pick my directions moving forward and argue when my PI is wrong
Gahhhhhh
#i should not drink coffee and then go to the bathroom on my phone or you get sloppy brain dumps like this#blog#stem#academia
3 notes
·
View notes
Text
stream of consciousness about the new animation vs. coding episode, as a python programmer
holy shit, my increasingly excited reaction as i realized that yellow was writing in PYTHON. i write in python. it's the programming language that i used in school and currently use at work.
i was kinda expecting a print("hello world") but that's fine
i think using python to demonstrate coding was a practical choice. it's one of the most commonly used programming languages and it's very human readable.
the episode wasn't able to cram every possible concept in programming, of course, but they got a lot of them!
fun stuff like print() not outputting anything and typecasting between string values and integer values!!
string manipulation
booleans
little things like for-loops and while-loops for iterating over a string or list. and indexing! yay :D
*iterable input (star-unpacking) :D (the *bomb that got thrown at yellow)
and then they started importing libraries! i've never seen the turtle library but it seems like it draws vectors based on the angle you input into a function
the gun list ran out of "bullets" because it kept removing them from the list with gun.pop()
AND THEN THE DATA VISUALIZATION. matplotlib!! numpy!!!! my beloved!!!!!!!! i work in data so this!!!! this!!!!! it's somehow really validating to me to see my favorite animated web series play with data. i think it's also a nice touch that the blue on the bars appears to be the matplotlib default blue. the plot formatting is accurate too!!!
haven't really used pygame either but making shapes and making them move based on arrow key input makes sense
i recall that yellow isn't the physically strongest, but it's cool to see them move around in space and i'm focusing on how they move and figure out the world.
nuke?!
and back to a syntax error, and then commenting it out with # made it go away
cool nuke text motion graphics too :D (i don't think i make that motion in python, personally)
and then yellow cranks it to 100,000 to make a neural network in pytorch. this gets into nlp (tokenizers and other modeling)
a CLASS? we touch on some object oriented programming here, but we just see the __init__ function, so the full concept isn't demonstrated here.
OH! the "hello world" got broken down into tokens. that's why we see the "hello world" string turn into numbers and then... bits (the 0s and 1s)? the strings are tokenized/turned into values that the model can interpret. it's trying to understand written human language
and then an LSTM?! (long short-term memory)
something something feed-forward neural network
model training (hence the epochs and increasing accuracy)
honestly, the scrolling through the code goes so fast, i had to do a second look through (i'm also not very deeply versed in implementing neural networks but i have learned about them in school)
and all of this to send "hello world" to an AI(?) recreation of the exploded laptop
not too bad for a macbook user lol
i'm just kidding, a majority of people used macs in my classes
things i wanna do next since i'm so hyped
i haven't drawn for the fandom in a long time, but i feel a little motivated to draw my design of yellow again. the episode only briefly touched on object oriented programming, but i kinda want to make a very simple example where the code is an initialization of a stick figure object and the instances are each of the color gang.
it wouldn't be full blown AI, just me writing everyone's personality traits and colors into a class initializer, essentially, since each stick figure is an individual program.
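something like this, maybe (a rough sketch of the idea — names and traits made up by me):

class StickFigure:
    """one member of the color gang - no AI, just hardcoded traits"""

    def __init__(self, name, color, traits):
        self.name = name
        self.color = color
        self.traits = traits

    def introduce(self):
        print(f"{self.name} ({self.color}): {', '.join(self.traits)}")

# each instance is one stick figure
yellow = StickFigure("Yellow", "#ffff00", ["curious", "loves coding"])
red = StickFigure("Red", "#ff0000", ["energetic", "chaotic"])
yellow.introduce()
red.introduce()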
#animator vs animation#ava#yellow ava#ava yellow#long post#thank you if you took the time to read lol
5 notes
·
View notes