#Interpolation Techniques Assignment Help
A structured way to learn JavaScript.
I came across a post on Twitter that I thought would be helpful to share with those who are struggling to find a structured way to learn JavaScript on their own. Personally, I wish I had had access to this information when I first started learning in January. However, I am grateful for my learning journey so far, as I have covered most topics, albeit in a less structured manner.
N.B.: Not everyone learns in the same way; it's important to find what works for you. This is a guide, not a rulebook.
EASY
What is JavaScript and its role in web development?
Brief history and evolution of JavaScript.
Basic syntax and structure of JavaScript code.
Understanding variables, constants, and their declaration.
Data types: numbers, strings, boolean, and null/undefined.
Arithmetic, assignment, comparison, and logical operators.
Combining operators to create expressions.
Conditional statements (if, else if, else) for decision making.
Loops (for, while) for repetitive tasks.
Switch statements for multiple conditional cases.
MEDIUM
Defining functions, including parameters and return values.
Function scope, closures, and their practical applications.
Creating and manipulating arrays.
Working with objects, properties, and methods.
Iterating through arrays and objects.
Understanding the Document Object Model (DOM).
Selecting and modifying HTML elements with JavaScript.
Handling events (click, submit, etc.) with event listeners.
Using try-catch blocks to handle exceptions.
Common error types and debugging techniques.
HARD
Callback functions and their limitations.
Dealing with asynchronous operations, such as AJAX requests.
Promises for handling asynchronous operations.
Async/await for cleaner asynchronous code.
Arrow functions for concise function syntax.
Template literals for flexible string interpolation.
Destructuring for unpacking values from arrays and objects.
Spread/rest operators.
Design Patterns.
Writing unit tests with testing frameworks.
Code optimization techniques.
That's it I guess!
Data Matters: How to Curate and Process Information for Your Private LLM
In the era of artificial intelligence, data is the lifeblood of any large language model (LLM). Whether you are building a private LLM for business intelligence, customer service, research, or any other application, the quality and structure of the data you provide significantly influence its accuracy and performance. Unlike publicly trained models, a private LLM requires careful curation and processing of data to ensure relevance, security, and efficiency.
This blog explores the best practices for curating and processing information for your private LLM, from data collection and cleaning to structuring and fine-tuning for optimal results.
Understanding Data Curation
Importance of Data Curation
Data curation involves the selection, organization, and maintenance of data to ensure it is accurate, relevant, and useful. Poorly curated data can lead to biased, irrelevant, or even harmful responses from an LLM. Effective curation helps improve model accuracy, reduce biases, enhance relevance and domain specificity, and strengthen security and compliance with regulations.
Identifying Relevant Data Sources
The first step in data curation is sourcing high-quality information. Depending on your use case, your data sources may include:
Internal Documents: Business reports, customer interactions, support tickets, and proprietary research.
Publicly Available Data: Open-access academic papers, government databases, and reputable news sources.
Structured Databases: Financial records, CRM data, and industry-specific repositories.
Unstructured Data: Emails, social media interactions, transcripts, and chat logs.
Before integrating any dataset, assess its credibility, relevance, and potential biases.
Filtering and Cleaning Data
Once you have identified data sources, the next step is cleaning and preprocessing. Raw data can contain errors, duplicates, and irrelevant information that can degrade model performance. Key cleaning steps include removing duplicates to ensure unique entries, correcting errors such as typos and incorrect formatting, handling missing data through interpolation techniques or removal, and eliminating noise such as spam, ads, and irrelevant content.
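As a rough illustration of these cleaning steps, here is a minimal Python sketch; the record fields ("id", "text") and the rules are illustrative assumptions, not a prescribed pipeline:

```python
# Minimal cleaning sketch: drop missing entries, strip whitespace, dedupe.
# The record shape is hypothetical, chosen only to show the steps.

def clean_records(records):
    seen = set()
    cleaned = []
    for rec in records:
        text = (rec.get("text") or "").strip()
        if not text:                # handle missing data by removal
            continue
        key = text.lower()
        if key in seen:             # remove duplicates (case-insensitive)
            continue
        seen.add(key)
        cleaned.append({**rec, "text": text})
    return cleaned

raw = [
    {"id": 1, "text": "Quarterly report summary."},
    {"id": 2, "text": "quarterly report summary."},  # duplicate
    {"id": 3, "text": None},                         # missing
    {"id": 4, "text": "  Customer ticket #42  "},    # needs stripping
]
print(clean_records(raw))
# [{'id': 1, 'text': 'Quarterly report summary.'}, {'id': 4, 'text': 'Customer ticket #42'}]
```

Real pipelines add spam/ad filtering and format-specific fixes on top of this skeleton.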
Data Structuring for LLM Training
Formatting and Tokenization
Data fed into an LLM should be in a structured format. This includes standardizing text formats by converting different document formats (PDFs, Word files, CSVs) into machine-readable text, tokenization to break down text into smaller units (words, subwords, or characters) for easier processing, and normalization by lowercasing text, removing special characters, and converting numbers and dates into standardized formats.
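A minimal sketch of the normalization and tokenization steps described above; this uses naive word-level tokenization for clarity, whereas real LLM pipelines use subword tokenizers (e.g. BPE):

```python
import re

def normalize(text):
    # Lowercase, strip special characters, collapse whitespace.
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def tokenize(text):
    # Naive word-level split over the normalized text.
    return normalize(text).split()

print(tokenize("Revenue grew 12% in Q3!"))  # ['revenue', 'grew', '12', 'in', 'q3']
```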
Labeling and Annotating Data
For supervised fine-tuning, labeled data is crucial. This involves categorizing text with metadata, such as entity recognition (identifying names, locations, dates), sentiment analysis (classifying text as positive, negative, or neutral), topic tagging (assigning categories based on content themes), and intent classification (recognizing user intent in chatbot applications). Annotation tools like Prodigy, Labelbox, or Doccano can facilitate this process.
Structuring Large Datasets
To improve retrieval and model efficiency, data should be stored in a structured format such as vector databases (using embeddings and vector search for fast retrieval like Pinecone, FAISS, Weaviate), relational databases (storing structured data in SQL-based systems), or NoSQL databases (storing semi-structured data like MongoDB, Elasticsearch). Using a hybrid approach can help balance flexibility and speed for different query types.
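To make the vector-search idea concrete, here is a toy retrieval sketch in pure Python with made-up 3-dimensional embeddings; a real system would obtain embeddings from a model and store them in one of the vector databases named above:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 3-d "embeddings"; the documents and vectors are invented for illustration.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "api reference": [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]
best = max(docs, key=lambda d: cosine(query, docs[d]))
print(best)  # refund policy
```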
Processing Data for Model Training
Preprocessing Techniques
Before feeding data into an LLM, preprocessing is essential to ensure consistency and efficiency. This includes data augmentation (expanding datasets using paraphrasing, back-translation, and synthetic data generation), stopword removal (eliminating common but uninformative words like "the," "is"), stemming and lemmatization (reducing words to their base forms, e.g. "running" → "run"), and encoding and embedding (transforming text into numerical representations for model ingestion).
Splitting Data for Training
For effective training, data should be split into a training set (80%) used for model learning, a validation set (10%) used for tuning hyperparameters, and a test set (10%) used for final evaluation. Proper splitting ensures that the model generalizes well without overfitting.
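The 80/10/10 split above can be sketched as a shuffled partition over in-memory items; the fixed seed is an illustrative choice for reproducibility:

```python
import random

def split_dataset(items, seed=0):
    # Shuffle, then partition 80/10/10 into train/validation/test.
    items = items[:]
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(n * 0.8)
    n_val = int(n * 0.1)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

train, val, test = split_dataset(list(range(100)))
print(len(train), len(val), len(test))  # 80 10 10
```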
Handling Bias and Ethical Considerations
Bias in training data can lead to unfair or inaccurate model predictions. To mitigate bias, ensure diverse data sources that provide a variety of perspectives and demographics, use bias detection tools such as IBM AI Fairness 360, and integrate human-in-the-loop review to manually assess model outputs for biases. Ethical AI principles should guide dataset selection and model training.
Fine-Tuning and Evaluating the Model
Transfer Learning and Fine-Tuning
Rather than training from scratch, private LLMs are often fine-tuned on top of pre-trained models (e.g., GPT, Llama, Mistral). Fine-tuning involves selecting a base model that aligns with your needs, using domain-specific data to specialize the model, and training with hyperparameter optimization by tweaking learning rates, batch sizes, and dropout rates.
Model Evaluation Metrics
Once the model is trained, its performance must be evaluated using metrics such as perplexity (measuring how well the model predicts the next word), BLEU/ROUGE scores (evaluating text generation quality), and human evaluation (assessing outputs for coherence, factual accuracy, and relevance). Continuous iteration and improvement are crucial for maintaining model quality.
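Perplexity, for example, is just the exponential of the average negative log-likelihood the model assigns to the held-out tokens; a sketch with made-up per-token probabilities:

```python
import math

def perplexity(token_probs):
    # exp of the mean negative log-likelihood; lower is better.
    nll = [-math.log(p) for p in token_probs]
    return math.exp(sum(nll) / len(nll))

# Hypothetical probabilities assigned to each actual next token.
print(round(perplexity([0.25, 0.25, 0.25, 0.25]), 2))  # 4.0
```

A uniform guess over four choices gives perplexity 4, matching the intuition that perplexity measures the model's effective branching factor.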
Deployment and Maintenance
Deploying the Model
Once the LLM is fine-tuned, deployment considerations include choosing between cloud vs. on-premise hosting depending on data sensitivity, ensuring scalability to handle query loads, and integrating the LLM into applications via REST or GraphQL APIs.
Monitoring and Updating
Ongoing maintenance is necessary to keep the model effective. This includes continuous learning by regularly updating with new data, model drift detection to identify and correct performance degradation, and user feedback integration to use feedback loops to refine responses. A proactive approach to monitoring ensures sustained accuracy and reliability.
Conclusion
Curating and processing information for a private LLM is a meticulous yet rewarding endeavor. By carefully selecting, cleaning, structuring, and fine-tuning data, you can build a robust and efficient AI system tailored to your needs. Whether for business intelligence, customer support, or research, a well-trained private LLM can offer unparalleled insights and automation, transforming the way you interact with data.
Invest in quality data, and your model will yield quality results.
Video Annotation: Transforming Motion into AI-Ready Insights
AI and ML have changed the way people interact with digital content, and video annotation sits at the center of that transformation. It enables everything from pedestrian detection in autonomous vehicles to real-time threat monitoring in surveillance systems, but an AI system's ability to interpret movement depends heavily on high-quality annotated video data.
In simple terms, it is labeling video frames to add context for AI models. Annotated videos teach AI systems more effectively to identify underlying patterns, behaviors, and environments by marking objects, tracking movement, and defining interactions. Thus, they become smart, fast, and reliable decision-makers.
Why Video Annotation Matters in AI Development
AI models grow from data, and video carries much richer, more dynamic information than static images. It provides motion, depth, and continuity, allowing AI to:
Understand movement and object tracking in real-world environments.
Analyze complex interactions between people, objects, and the environment.
Improve decision-making in real-time AI applications.
Enhance accuracy by considering multiple frames rather than isolated snapshots.
These capabilities make video annotation essential for applications ranging from self-driving cars to sports analytics and healthcare diagnostics.
Types of Video Annotation Techniques
The effectiveness of an AI model depends on the detail and suitability of the techniques used to annotate the video. Each method serves a particular type of model, supplying exactly the kind of data it needs during learning.
Bounding Box Annotation: This method involves drawing rectangular boxes around the objects identified in every frame. Bounding boxes provide an intuitive and effective training method for AI models to detect and recognize objects in the scene.
Semantic Segmentation: A more complex technique that assigns a category label to every pixel in a video frame. With semantic segmentation, the AI reaches a near-human understanding of video content and can recognize objects far more precisely.
Keypoint Annotation: This method marks important points on an object and is used in applications ranging from human pose analysis to facial recognition. By mapping joints and their trajectories, it helps the AI understand body language and gestures.
Optical Flow Annotation: The optical flow technique tracks motion as pixel displacement between adjacent frames. It allows AI models to analyze past trajectories and predict likely future object positions.
3D Cuboid Annotation: Unlike bounding boxes, 3D cuboids give depth perception, enabling AI to recognize object dimensions and spatial relationships. This annotation type ensures that AI understands object placement in three-dimensional space.
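Annotation tools often reduce per-frame effort by interpolating boxes between hand-labeled keyframes; a minimal sketch, assuming an illustrative [x, y, w, h] box format:

```python
def interpolate_box(box_a, box_b, frame_a, frame_b, frame):
    # Linearly interpolate an [x, y, w, h] box between two keyframes.
    t = (frame - frame_a) / (frame_b - frame_a)
    return [a + t * (b - a) for a, b in zip(box_a, box_b)]

# Boxes labeled by hand at frames 0 and 10; frame 5 is filled in automatically.
print(interpolate_box([100, 50, 40, 80], [200, 70, 40, 80], 0, 10, 5))
# [150.0, 60.0, 40.0, 80.0]
```

Only keyframes need manual labeling; intermediate frames are generated, then spot-checked by annotators.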
Challenges in Video Annotation
Despite its tremendous advantages, video annotation has pain points that can compromise AI model training.
Huge Volume of Data: Videos contain thousands of frames, making annotation a labor-intensive job. Streamlining the workflow with automated or AI-assisted annotation can help considerably.
Motion Blur and Occlusion: Moving objects are difficult to annotate because they can occlude one another or become blurred by motion. Frame interpolation methods can improve annotation accuracy in these cases.
Data Privacy and Compliance: Video content is often sensitive by nature and therefore subject to regulations such as the GDPR and CCPA. Proper anonymization methods keep AI development ethical and compliant.
Annotation Consistency: Labeling must be consistent across frames and datasets. High variability introduces bias into machine learning models, so rigorous quality control is necessary.
The Future of Video Annotation in AI
As AI improves, video annotation techniques will change with it. Emerging trends in video motion analysis include:
AI-assisted annotation: AI pre-labels objects automatically, reducing manual effort.
Self-supervised learning: AI learns from unlabeled video data.
Real-time annotation: AI annotates live video feeds as they stream.
Federated learning: AI models are trained across many client-side sources while preserving privacy.
These developments will increase AI accuracy, reduce training time, and improve the efficiency of AI applications.
Conclusion
Video annotation is the backbone of AI-powered motion analysis, providing the training data machine learning models require. When objects, movements, and interactions are labeled correctly, AI systems can interpret and understand the real world with improved accuracy.
From autonomous vehicles and healthcare to sports analytics and retail, video annotation is transforming industries and driving the path to AI innovation for the future. As annotation techniques develop, AI will continue to learn, adapt, and revolutionize the way we interact with visual data.
Visit Globose Technology Solutions to see how the team can speed up your video annotation projects.
Pure Mathematics. Topic Numerical Techniques.
Since I have a test on Monday I thought I would use this blog for its intended purpose.
When the normal techniques do not work, we use numerical techniques to find the root of a function. The normal techniques are factorization, completing the square, and using the quadratic formula x = (-b ± √(b^2 - 4ac)) / 2a.
Firstly, we NEED to establish that there is a root. To do so we use the intermediate value theorem (IMVT). The formal definition of the IMVT is that
IF y = f(x) is a continuous function on the interval a < x < b, and if k is a number lying between f(a) < k < f(b) or f(b) < k < f(a), then there exists at least one number c with a < c < b such that f(c) = k.
BUT there is no need to commit the definition to memory.
Going back to establishing the root.
E.g. Show that x^2 - 3x - 1 = 0 has a root in [3, 4].
The [3, 4] in this case means an interval: 3 < root < 4, i.e. the root lies between 3 and 4.
For a root to exist there MUST be a change in sign, either (-ve to +ve) or (+ve to -ve).
f(3) = 3^2 - 3(3) - 1
     = -1
f(4) = 4^2 - 3(4) - 1
     = 3
What I just did is called substitution: I replaced x with the values at the ends of the given interval.
Now we write a STATEMENT. The change in sign indicates that there is a root, and a statement is needed.
Since f(x) is a continuous function on [3, 4] and f(3) x f(4) < 0, then by the IMVT there exists a root such that 3 < root < 4.
3 < root < 4 means the root lies between the ends of the interval.
f(3) x f(4) < 0 means that, multiplied together, they produce a negative value, i.e. one less than 0.
This needs to be memorised.
There are four methods in this topic:
Interval bisection
Linear Interpolation
Newton-Raphson
and
Fixed Point Iteration
In this post, I'll be covering Interval Bisection and Linear Interpolation.
Interval Bisection
First, prove that there is a root.
f(x): x^2 - 4x + 1 = 0, [3, 4]
f(3) = (3)^2 - 4(3) + 1
     = -2
f(4) = 1
Since f(x) is a continuous function on [3, 4] and f(3) x f(4) < 0, then by the IMVT there exists a root such that 3 < root < 4.
Now using the interval bisection method
WE FIND THE MIDPOINT
(3 + 4)/2 = 7/2 = 3.5
Consider f(3.5) = (3.5)^2 - 4(3.5) + 1
f(3.5) = -0.75
Now we change the interval.
Since f(3.5) x f(4) < 0
Then 3.5 < root < 4, or [3.5, 4]
Keep in mind that the root EXISTS where there is a change of sign.
(3.5 + 4)/2 = 3.75
Consider f(3.75)= 0.0625
Since f(3.5) x f(3.75) < 0
    Then 3.5 < root < 3.75  or [3.5, 3.75]
(3.75 + 3.5)/2 = 3.625
Consider f(3.625) = -0.359
Since f(3.625) x f(3.75) < 0
Then 3.625 < root < 3.75
After a few more rounds you should get
3.719 < root < 3.734, identical to 1 decimal place.
As you can see this method can be lengthy
A question you might have is: how do you know when to stop?
Well, 2 ways:
When the bounds are identical to the asked decimal places
e.g. 3.7 < root < 3.7, which is identical to 1 decimal place
or after however many iterations they ask you for.
e.g. Do interval bisection 3 times. With reference to the example we worked above, the answer here would be 3.625 < root < 3.75.
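The bisection procedure above can be sketched in a few lines of Python, using the same f(x) = x^2 - 4x + 1 on [3, 4]:

```python
def bisect(f, a, b, iterations):
    # Assumes f(a) * f(b) < 0, i.e. a sign change (the IMVT guarantees a root).
    assert f(a) * f(b) < 0
    for _ in range(iterations):
        m = (a + b) / 2
        if f(a) * f(m) < 0:
            b = m          # sign change in [a, m], so the root lies there
        else:
            a = m          # otherwise the root lies in [m, b]
    return a, b

f = lambda x: x**2 - 4*x + 1
print(bisect(f, 3, 4, 3))  # (3.625, 3.75), matching the worked example
```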
LINEAR INTERPOLATION
There are two proofs included in this method that I'll attach a picture of below. You must memorise these proofs.

In this method, we assign the interval endpoints as a and b: a is normally the smaller endpoint and b the larger.
We then find f(a) and f(b), and substitute into either of the formulas in the above proof. An example of this is in the picture below.

There was no need to continue since 0.2 < root < 0.2
As stated, we only need the statement "since f(a) x f(b) < 0, a < root < b" if we are doing another interpolation, which is why that statement is absent after working out x2.
Again, we stop either when the bounds are identical to the stated decimal places or after the requested number of iterations.
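In code, one linear interpolation (false position) step takes the x-intercept of the chord through (a, f(a)) and (b, f(b)), which in one common form is x = (a*f(b) - b*f(a)) / (f(b) - f(a)). A sketch on the same f(x) = x^2 - 4x + 1 on [3, 4]:

```python
def false_position(f, a, b, iterations):
    # Each step replaces one endpoint with the chord's x-intercept,
    # keeping the sign change (and hence the root) inside [a, b].
    assert f(a) * f(b) < 0
    for _ in range(iterations):
        x = (a * f(b) - b * f(a)) / (f(b) - f(a))
        if f(a) * f(x) < 0:
            b = x
        else:
            a = x
    return a, b

f = lambda x: x**2 - 4*x + 1
a, b = false_position(f, 3, 4, 5)
print(round(a, 3), round(b, 3))
```

For this function the lower bound converges quickly toward the root 2 + √3 ≈ 3.732, usually in far fewer steps than bisection.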
Hope you found this helpful. I'll post the second part... sometime!
But for now I have to revise those other two methods for my test tomorrow.
Image Annotation in Healthcare, Retail, and Autonomous Driving
Introduction:
Image Annotation plays a crucial role in the fields of artificial intelligence (AI) and machine learning (ML), as it involves the careful labeling of images to train models for effective object recognition and scene analysis. This essential process allows AI systems to interpret visual information, thereby supporting various applications such as autonomous driving, medical imaging, and retail analytics.
Categories of Image Annotation:
Bounding Box Annotation: This method entails drawing rectangular boxes around objects in an image, aiding AI models in detecting and classifying different elements. It is commonly utilized in scenarios such as object detection for self-driving cars and surveillance systems.
Semantic Segmentation: In this approach, each pixel in an image is assigned a specific class label, which facilitates a comprehensive understanding of the scene. This technique is vital in medical imaging for differentiating between various tissue types and in autonomous driving for recognizing roadways versus sidewalks.
Polygon Annotation: For objects with non-standard shapes, polygon annotation allows for accurate contour mapping, improving the model's capability to identify complex structures. This technique is particularly beneficial in agricultural technology for recognizing different plant species.
Key Point Annotation: This method involves marking significant points on objects, such as facial features or joint locations, which helps models comprehend object orientation and movement. It is critical in applications like facial recognition and pose estimation.
Cuboid Annotation: Expanding beyond two dimensions, cuboid annotation provides three-dimensional labeling, offering depth information to models. This is essential in robotics and autonomous navigation for understanding spatial relationships.
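As a concrete fragment of polygon annotation processing, the area enclosed by an annotation can be computed from its vertex list with the shoelace formula (the vertices below are illustrative):

```python
def polygon_area(points):
    # Shoelace formula over (x, y) vertices listed in order around the polygon.
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A 4x3 rectangle expressed as a polygon annotation.
print(polygon_area([(0, 0), (4, 0), (4, 3), (0, 3)]))  # 12.0
```

Areas like this feed quality checks, e.g. flagging degenerate or implausibly small annotations before training.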
Industry Applications:
Autonomous Technology: Comprehensive image annotations are vital for self-driving vehicles to accurately perceive and react to their surroundings, thereby ensuring safety and operational efficiency.
Healthcare: In the realm of medical diagnostics, accurate annotations of medical images such as MRIs and CT scans support AI models in identifying various conditions.
GTS's Proficiency in Image Annotation:
At GTS, we excel in delivering extensive image and video annotation services designed to meet the specific requirements of various industries. Our dedicated team meticulously annotates each image, ensuring that even the smallest details are captured to facilitate the development of advanced machine learning models. Utilizing cutting-edge tools and methodologies, we provide data that is not only accurately labeled but also meaningful and actionable.
Our offerings encompass bounding box annotation, semantic segmentation, cuboid annotation, image classification, polygon annotation, key points annotation, lane annotation, custom annotation, and 3D point cloud annotation. We serve a wide range of sectors, including autonomous technology, healthcare, government, retail, finance, and technology.
By collaborating with GTS, organizations can significantly improve the performance of their AI models, resulting in more intelligent and responsive applications across multiple fields.
For further details regarding our image and video annotation services, please visit our website.
Tools and Services for Image Annotation
To address the challenges of scale and consistency in annotation, various tools and services have been developed to streamline the process:
Computer Vision Annotation Tool (CVAT): An open-source, web-based tool designed for annotating digital images and videos. CVAT supports tasks such as object detection, image classification, and image segmentation, offering features like interpolation of shapes between keyframes and semi-automatic annotation using deep learning models.
SuperAnnotate: Ranked as a leading data labeling platform, SuperAnnotate provides an end-to-end data solution with an integrated service marketplace, facilitating efficient and accurate annotation processes.
Anolytics: Offers data annotation services for machine learning, specializing in image annotation to make objects recognizable to computer vision models.
GTS.ai: Provides comprehensive image and video annotation services, employing techniques such as bounding box annotation, semantic segmentation, cuboid annotation, and more to enhance AI algorithms across various industries.
Future of Image Annotation
Advancements in AI are leading towards automated annotation methods, reducing the reliance on manual efforts. Techniques such as automatic image annotation, where systems assign metadata to images without human intervention, are being developed to enhance efficiency and scalability.
Conclusion
Image annotation is a cornerstone of computer vision and AI, transforming raw visual data into structured information that machines can comprehend. As AI continues to evolve, the methods and tools for image annotation will advance further, expanding the potential applications and impact of intelligent systems across various sectors.
Tech talk with Arcoiris Logics
Unlocking JavaScript Secrets: Hidden Tech Tips Every Developer Should Know

JavaScript continues to reign as one of the most powerful and widely used programming languages. From creating interactive websites to building complex applications, JavaScript powers the web in ways many developers are yet to fully understand. In this post, we're diving into some hidden tech tips that can help you improve your JavaScript coding skills and efficiency. Here are some JavaScript tips for developers that will take your development game to the next level.

1. Use Destructuring for Cleaner Code
JavaScript's destructuring assignment allows you to extract values from arrays or objects into variables in a cleaner and more readable way. It's perfect for dealing with complex data structures and improves the clarity of your code.

const user = { name: "John", age: 30, location: "New York" };
const { name, age } = user;
console.log(name, age); // John 30

2. Master Arrow Functions
Arrow functions provide a shorter syntax for writing functions and fix some common issues with the this keyword. They are especially useful in callback functions and higher-order functions.

const greet = (name) => `Hello, ${name}!`;
console.log(greet("Alice")); // Hello, Alice!

3. Leverage Default Parameters
Default parameters allow you to set default values for function parameters when no value is passed. This feature can help prevent errors and make your code more reliable.

function greet(name = "Guest") {
  return `Hello, ${name}!`;
}
console.log(greet()); // Hello, Guest!

4. Master Promises and Async/Await
Promises and async/await are essential for handling asynchronous operations in JavaScript. While callbacks were once the go-to solution, promises provide a more manageable way to handle complex asynchronous code.

const fetchData = async () => {
  try {
    const response = await fetch("https://api.example.com");
    const data = await response.json();
    console.log(data);
  } catch (error) {
    console.error(error);
  }
};
fetchData();

5. Use Template Literals for Dynamic Strings
Template literals make string interpolation easy and more readable. They also support multi-line strings and expression evaluation directly within the string.

const user = "Alice";
const message = `Hello, ${user}! Welcome to JavaScript tips.`;
console.log(message);

6. Avoid Global Variables with IIFEs (Immediately Invoked Function Expressions)
Global variables can be a source of bugs, especially in large applications. An IIFE helps by creating a local scope for your variables, preventing them from polluting the global scope.

(function() {
  const temp = "I am local!";
  console.log(temp);
})();

7. Array Methods You Should Know
JavaScript offers powerful array methods such as map(), filter(), reduce(), and forEach(). These methods allow for more functional programming techniques, making code concise and easier to maintain.

const numbers = [1, 2, 3, 4];
const doubled = numbers.map(num => num * 2);
console.log(doubled); // [2, 4, 6, 8]

8. Take Advantage of the Spread Operator
The spread operator (...) can be used to copy elements from arrays or properties from objects. It's a game changer when it comes to cloning data or merging multiple arrays and objects.

const arr1 = [1, 2, 3];
const arr2 = [...arr1, 4, 5];
console.log(arr2); // [1, 2, 3, 4, 5]

9. Debug with console.table()
When working with complex data structures, console.table() lets you output an object or array in a neat table format, making debugging much easier.

const users = [{ name: "John", age: 25 }, { name: "Alice", age: 30 }];
console.table(users);

10. Use Strict Mode for Cleaner Code
Activating strict mode in JavaScript helps eliminate some silent errors by making them throw exceptions. It's especially useful when debugging code or working in larger teams.

"use strict";
let x = 3.14;

Conclusion
JavaScript offers a wealth of features and hidden gems that, when mastered, can drastically improve your coding skills. By incorporating these tips into your everyday coding practices, you'll write cleaner, more efficient code and stay ahead of the curve in JavaScript development. As technology continues to evolve, staying updated with JavaScript tips and best practices will ensure you're always on top of your development game. Happy coding!
Data Labeling Strategies for Cutting-Edge Segmentation Projects

Deep learning has been very successful with image data and is now at a stage where it outperforms humans on multiple use cases. The most important problems humans have wanted to solve with computer vision are image classification, object detection, and segmentation, in increasing order of difficulty.
In plain image classification we are only interested in getting the labels of all the objects present in an image. Object detection goes further: along with identifying the objects present in an image, it locates them with bounding boxes. Image segmentation takes this to a new level by finding the exact boundary of each object in the image.
What is image segmentation?
We know an image is nothing but a collection of pixels. Image segmentation is the process of classifying each pixel in an image as belonging to a certain class, and hence can be thought of as a per-pixel classification problem. There are two types of segmentation techniques:
Semantic segmentation: Semantic segmentation is the process of classifying each pixel as belonging to a particular label. It doesn't differentiate across different instances of the same object. For example, if there are 2 cats in an image, semantic segmentation gives the same label to all the pixels of both cats.
Instance segmentation: Instance segmentation differs from semantic segmentation in that it gives a unique label to every instance of a particular object in the image. As can be seen in the image above, all 3 dogs are assigned different colors, i.e. different labels. With semantic segmentation, all of them would have been assigned the same color.
There are numerous advances in segmentation algorithms and open-source datasets. But to solve a particular problem in your domain, you will still need human-labeled images or human-based verification. In this article, we will go through some of the nuances of segmentation task labeling and how a human workforce can work in tandem with machine-learning-based approaches.
To train your machine learning model, you need high-quality labels. A successful data labeling project for segmentation depends on three key ingredients:
Labeling Tools
Training
Quality Management
Labeling ToolsÂ
There are many open-source and commercially available tools on the market. At Objectways, we train our workforce using open-source CVAT, which provides a polygon tool with interpolation and assistive tooling that gives 4x faster labeling, and then we use whichever tool fits the use case.
Here are the leading tools we recommend for labeling. For efficient labeling, prefer a tool that allows pre-labeling and assistive labeling using techniques like Deep Extreme Cut or GrabCut, plus good review capabilities such as per-label opacity controls.
Workforce training
While it is easy to train a resource to perform simple image tasks such as classification or bounding boxes, segmentation tasks require more training, as they involve multiple mechanisms to optimize time, increase efficiency, and reduce worker fatigue. Here are some simple training techniques:
Utilize assistive tooling: An annotator may start with a simple brush or polygon tool, which is easy to pick up. But at volume, these tools tend to induce muscle fatigue, so it is important to make use of assistive tooling.
Gradually introduce complex tasks: Annotators get better at a given task with time, and this should be built into the training program. At Objectways, we start training by introducing annotators to simple images with relatively easy shapes (cars/buses/roads) and migrate them to complex shapes such as vegetation and barriers.
Use a variety of available open-source pre-labeled datasets: It is also important to train the workforce on different datasets; we use PascalVOC, COCO, Cityscapes, LiTS, CCP, Pratheepan, and Inria Aerial Image Labeling.
Provide feedback: It is also important to provide timely feedback on annotators' work, so we use a golden-set technique: a reference set created by our senior annotators with 99.99% accuracy, used to give annotators feedback during training.
Quality Management
In machine learning, there are different techniques to understand and evaluate the results of a model.
Pixel accuracy: Pixel accuracy is the most basic metric that can be used to validate the results. Accuracy is obtained by taking the ratio of correctly classified pixels to total pixels.
Intersection over Union: IoU is defined as the ratio of the intersection of the ground truth and predicted segmentation outputs over their union. When calculating for multiple classes, the IoU of each class is calculated and their mean is taken. It is a better metric than pixel accuracy: if every pixel is predicted as background in a 2-class image that is 90% background, the mean IoU is (90/100 + 0/100)/2 = 45%, which reflects quality better than the 90% accuracy does.
F1 Score: The F1 score, popular in classification, can be used for segmentation tasks as well to deal with class imbalance.
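To make the numeric example above concrete, here is a small pure-Python sketch of pixel accuracy and mean IoU. The function names and the 10x10 toy image are illustrative only, not taken from any particular labeling tool or library.

```python
def pixel_accuracy(pred, gt):
    # fraction of pixels whose predicted label matches the ground truth
    match = sum(p == g for p, g in zip(pred, gt))
    return match / len(gt)

def mean_iou(pred, gt, num_classes):
    # per-class intersection over union, averaged across classes
    ious = []
    for c in range(num_classes):
        inter = sum(p == c and g == c for p, g in zip(pred, gt))
        union = sum(p == c or g == c for p, g in zip(pred, gt))
        ious.append(inter / union if union else 0.0)
    return sum(ious) / num_classes

# flattened 10x10 image: 90 background pixels (0), 10 foreground pixels (1)
gt = [1] * 10 + [0] * 90
pred = [0] * 100                 # model lazily predicts background everywhere

acc = pixel_accuracy(pred, gt)   # 0.90: looks deceptively good
miou = mean_iou(pred, gt, 2)     # 0.45: exposes the missed class
```

The lazy all-background prediction scores 90% pixel accuracy but only 45% mean IoU, matching the (90/100 + 0/100)/2 calculation in the text.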
If you have a labeled dataset, you can introduce a golden set in the labeling pipeline and use one of these scores to compare labels against your own ground truth. We focus on the following aspects to improve labeling quality:
Understand labeling instructions: Never underestimate the importance of good labeling instructions. Typically, instructions are authored by data scientists who are good at expressing what they want with examples. The human brain has a natural tendency to give weight to (and remember) negative experiences or interactions more than positive ones; they stand out more. So it is important to also provide examples of bad labeling. Reading instructions carefully often weeds out many systemic errors across tasks.
Provide timely feedback: Many workforces use a tiered structure where the level 1 workforce is less experienced than the quality control team. It is important to give level 1 annotators timely feedback so they understand unintentional labeling errors and do not repeat them in future tasks.
Rigorous quality audits: Many tools provide useful metrics to track label additions/deletions or changes over time. Just as an algorithm should converge and reduce its loss function, the time to QC a particular task and the suggested changes should converge to less than a .01% error rate. At Objectways, we have dedicated QC and super-QC teams with a consistent track record of over 99% accuracy.
Summary
We have discussed best practices for managing complex large-scale segmentation projects and provided guidance on tooling, workforce upskilling and quality management. Please contact [email protected] to provide feedback or if you have any questions.
MMD FX file reading for shaders: a translation by ryuu
The following tutorial is an English translation of the original one in Japanese by Dance Intervention P.
This English documentation was requested by Chestnutscoop on DeviantArt, as it'll be useful to the MME modding community and help MMD become open-source for updates. It's going to be an extensive one, so take it easy.
Disclaimer: coding isn't my area, not even close to my actual career and job (writing/health). I have little idea of what's going on here and I'm relying on my IT friends to help me with this one.
Content Index:
Introduction
Overall Flow
Parameter Declaration
Outline Drawing
Non-Self-shadow Rendering
Drawing Objects When Self-shadow is Disabled
Z-value Plot For Self-shadow Determination
Drawing Objects in Self-shadowing
Final Notes
1. INTRODUCTION
This documentation covers the basics of .fx file reading for MME, as well as information on DirectX and programmable shaders, while reading through full.fx version 1.3. In other words, how to use HLSL for MMD shaders. Everything in this tutorial will try to stay as faithful as possible to the original text in Japanese.
It was translated from Japanese to English by ryuu. As I don't know how to contact Dance Intervention P for permission to translate and publish it here, the original author is free to request me to take it down. The translation was done with the aid of the online translator DeepL and my friends' help. This documentation has no intention of replacing the original author's.
Any coding line starting with "// [Japanese text]" is the author's comments. If the coding isn't properly formatted on Tumblr, you can visit the original document to check it. The original titles of each section were added for ease of use.
2. OVERALL FLOW (全体の流れ)
Applicable technique → pass → VertexShader → PixelShader
• Technique: processing of annotations that fall under <>.
• Pass: processing unit.
• VertexShader: converts vertices in local coordinates to projective coordinates.
• PixelShader: sets the color of a pixel.
3. PARAMETER DECLARATION (パラメータ宣言)
9   // site-specific transformation matrix
10  float4x4 WorldViewProjMatrix : WORLDVIEWPROJECTION;
11  float4x4 WorldMatrix : WORLD;
12  float4x4 ViewMatrix : VIEW;
13  float4x4 LightWorldViewProjMatrix : WORLDVIEWPROJECTION < string Object = "Light"; >;
• float4x4: 32-bit floating point with 4 rows and 4 columns.
• WorldViewProjMatrix: a matrix that transforms vertices from local coordinates to projective coordinates, with the camera as the viewpoint, in a single step.
• WorldMatrix: a matrix that transforms vertices from local coordinates into world coordinates with the camera as the viewpoint.
• ViewMatrix: a matrix that converts world coordinate vertices to view coordinates with the camera as the viewpoint.
• LightWorldViewProjMatrix: a matrix that transforms vertices from local coordinates to projective coordinates, with the light as the viewpoint, in a single step.
• Local coordinate system: coordinates representing the positional relationship of vertices within the model.
• World coordinates: coordinates showing the positional relationship between models.
• View coordinates: coordinates representing the positional relationship with the camera.
• Projection coordinates: coordinates used to represent depth from the camera. There are two types: perspective projection and orthographic projection.
• Perspective projection: distant objects are shown smaller and nearby objects larger.
• Orthographic projection: the size of the image does not change with depth.
15  float3 LightDirection : DIRECTION < string Object = "Light"; >;
16  float3 CameraPosition : POSITION < string Object = "Camera"; >;
• LightDirection: light direction vector.
• CameraPosition: world coordinates of the camera.
18 // material color
19 float4 MaterialDiffuse : DIFFUSE < string Object = "Geometry"; >;
20 float3 MaterialAmbient : AMBIENT < string Object = "Geometry"; >;
21 float3 MaterialEmmisive : EMISSIVE < string Object = "Geometry"; >;
22 float3 MaterialSpecular : SPECULAR < string Object = "Geometry"; >;
23 float SpecularPower : SPECULARPOWER < string Object = "Geometry"; >;
24 float3 MaterialToon : TOONCOLOR;
25 float4 EdgeColor : EDGECOLOR;
• float3: no alpha value.
• MaterialDiffuse: diffuse light color of the material; Diffuse+A (alpha value) in PMD.
• MaterialAmbient: ambient light color of the material; PMD's Diffuse?
• MaterialEmmisive: light-emitting color of the material; Ambient in PMD.
• MaterialSpecular: specular light color of the material; PMD's Specular.
• SpecularPower: specular strength; PMD's Shininess.
• MaterialToon: toon shade color of the material; the lower left corner of the toon texture specified by the PMD.
• EdgeColor: outline color, as specified by MMD's edge color.
26 // light color
27 float3 LightDiffuse : DIFFUSE < string Object = "Light"; >;
28 float3 LightAmbient : AMBIENT < string Object = "Light"; >;
29 float3 LightSpecular : SPECULAR < string Object = "Light"; >;
30 static float4 DiffuseColor = MaterialDiffuse * float4(LightDiffuse, 1.0f);
31 static float3 AmbientColor = saturate(MaterialAmbient * LightAmbient + MaterialEmmisive);
32 static float3 SpecularColor = MaterialSpecular * LightSpecular;
• LightDiffuse: black (float3(0,0,0))?
• LightAmbient: MMD lighting manipulation values.
• LightSpecular: MMD lighting manipulation values.
• DiffuseColor: black, since it is multiplied by LightDiffuse?
• AmbientColor: PMD's Diffuse, made slightly stronger by MMD's lighting manipulation value?
• SpecularColor: PMD's Specular, made slightly stronger by MMD's lighting manipulation value?
34 bool parthf; // perspective flag
35 bool transp; // semi-transparent flag
36 bool spadd; // sphere map additive synthesis flag
37 #define SKII1 1500
38 #define SKII2 8000
39 #define Toon 3
• parthf: true for self-shadow distance setting mode2.
• transp: true when the semi-transparent flag is set.
• spadd: true for a .spa sphere file.
• SKII1: constant used in self-shadow mode1. If it is too large, the self-shadow develops strange artifacts; if it is too small, the shadow becomes too thin.
• SKII2: constant used in self-shadow mode2. If it is too large, the self-shadow develops strange artifacts; if it is too small, the shadow becomes too thin.
• Toon: weakens the shade in the direction of the light for close-range toon shading.
41   // object textures
42   texture ObjectTexture: MATERIALTEXTURE;
43   sampler ObjTexSampler = sampler_state {
44       texture = <ObjectTexture>;
45       MINFILTER = LINEAR;
46       MAGFILTER = LINEAR;
47   };
48
• ObjectTexture: the texture set in the material.
• ObjTexSampler: sets the conditions for sampling the material texture.
• MINFILTER: conditions for shrinking the texture.
• MAGFILTER: conditions for enlarging the texture.
• LINEAR: linear interpolation.
49   // sphere map textures
50   texture ObjectSphereMap: MATERIALSPHEREMAP;
51   sampler ObjSphareSampler = sampler_state {
52       texture = <ObjectSphereMap>;
53       MINFILTER = LINEAR;
54       MAGFILTER = LINEAR;
55   };
• ObjectSphereMap: the sphere map texture set in the material.
• ObjSphareSampler: sets the conditions for sampling the sphere map texture.
57   // this is a description to avoid overwriting the original MMD sampler. Cannot be deleted.
58   sampler MMDSamp0 : register(s0);
59   sampler MMDSamp1 : register(s1);
60   sampler MMDSamp2 : register(s2);
• register: assigns shader variables to specific registers.
• s0: sampler-type register 0.
4. OUTLINE DRAWING (輪郭描画)
Used to draw model contours; does not apply to accessories.
65   // vertex shader
66   float4 ColorRender_VS(float4 Pos : POSITION) : POSITION
67   {
68       // world-view projection transformation of camera viewpoint
69       return mul( Pos, WorldViewProjMatrix );
70   }
Returns the vertex coordinates of the camera viewpoint after the world-view projection transformation.
Parameters
• Pos: local coordinates of the vertex.
• POSITION (input): semantic indicating the vertex position in object space.
• POSITION (output): semantic indicating the position of a vertex in homogeneous space.
• mul (x,y): performs matrix multiplication of x and y.
Return value
Vertex coordinates in projective space; the screen coordinate position is computed by dividing by w.
• Semantics: communicate information about the intended use of parameters.
72   // pixel shader
73   float4 ColorRender_PS() : COLOR
74   {
75       // fill with outline color
76       return EdgeColor;
77   }
Returns the contour color for the corresponding input vertex.
Return value
Output color
• COLOR: output color semantic.
79   // contouring techniques
80   technique EdgeTec < string MMDPass = "edge"; > {
81       pass DrawEdge {
82           AlphaBlendEnable = FALSE;
83           AlphaTestEnable  = FALSE;
84
85           VertexShader = compile vs_2_0 ColorRender_VS();
86           PixelShader  = compile ps_2_0 ColorRender_PS();
87       }
88   }
Processing for contour drawing.
• MMDPass: specifies the drawing target to which the technique applies.
• "edge": contours of the PMD model.
• AlphaBlendEnable: enables alpha-blended transparency. Blends surface colors, materials and textures with transparency information to overlay them on another surface.
• AlphaTestEnable: per-pixel alpha test setting. If a pixel passes, it is processed into the framebuffer; otherwise all framebuffer processing for the pixel is skipped.
• VertexShader: shader variable representing the compiled vertex shader.
• PixelShader: shader variable representing the compiled pixel shader.
• vs_2_0: vertex shader profile for shader model 2.
• ps_2_0: pixel shader profile for shader model 2.
• Framebuffer: memory that holds the data for one frame until it is displayed on the screen.
5. NON-SELF-SHADOW SHADOW RENDERING (非セルフシャドウ影描画)
Draws the shadows cast on the ground in MMD; they are shown or hidden via MMD's ground shadow display setting.
94   // vertex shader
95   float4 Shadow_VS(float4 Pos : POSITION) : POSITION
96   {
97       // world-view projection transformation of camera viewpoint
98       return mul( Pos, WorldViewProjMatrix );
99   }
Returns the vertex coordinates of the shadow's source vertex after the world-view projection transformation of the camera viewpoint.
Parameters
• Pos: local coordinates of the vertex from which the shadow will be displayed.
Return value
Vertex coordinates in projective space.
101   // pixel shader
102   float4 Shadow_PS() : COLOR
103   {
104       // fill with ambient color
105       return float4(AmbientColor.rgb, 0.65f);
106   }
Returns the shadow color to be drawn. The alpha value is applied when MMD's transparent ground shadow color setting is enabled.
Return value
Output color
108   // techniques for shadow drawing
109   technique ShadowTec < string MMDPass = "shadow"; > {
110       pass DrawShadow {
111           VertexShader = compile vs_2_0 Shadow_VS();
112           PixelShader  = compile ps_2_0 Shadow_PS();
113       }
114   }
Processing for non-self-shadow shadow drawing.
• "shadow": simple ground shadow.
6. DRAWING OBJECTS WHEN SELF-SHADOW IS DISABLED (セルフシャドウ無効オブジェクト描画)
Draws objects when self-shadowing is disabled. Also used when editing model values.
120   struct VS_OUTPUT {
121       float4 Pos      : POSITION;    // projective transformation coordinates
122       float2 Tex      : TEXCOORD1;   // texture
123       float3 Normal   : TEXCOORD2;   // normal vector
124       float3 Eye      : TEXCOORD3;   // position relative to camera
125       float2 SpTex    : TEXCOORD4;   // sphere map texture coordinates
126       float4 Color    : COLOR0;      // diffuse color
127   };
A structure for passing multiple return values between shader stages. The final data to be passed must specify semantics.
Parameters
• Pos: stores the position of a vertex in projective coordinates, as the homogeneous space coordinate vertex shader output semantic.
• Tex: stores the UV coordinates of the vertex, as the texture coordinate 1 vertex shader output semantic.
• Normal: stores the vertex normal vector, as the texture coordinate 2 vertex shader output semantic.
• Eye: stores the eye vector (reversed?), as the texture coordinate 3 vertex shader output semantic.
• SpTex: stores the sphere map UV coordinates of the vertex, as the texture coordinate 4 vertex shader output semantic.
• Color: stores the diffuse light color of the vertex, as the color 0 vertex shader output semantic.
129   // vertex shader
130   VS_OUTPUT Basic_VS(float4 Pos : POSITION, float3 Normal : NORMAL, float2 Tex : TEXCOORD0, uniform bool useTexture, uniform bool useSphereMap, uniform bool useToon)
131   {
Converts the local coordinates of vertices to projective coordinates and sets the values to pass to the pixel shader, returning a VS_OUTPUT structure.
Parameters
• Pos: local coordinates of the vertex.
• Normal: normal in the local coordinates of the vertex.
• Tex: UV coordinates of the vertex.
• useTexture: whether a texture is used; given by the pass.
• useSphereMap: whether a sphere map is used; given by the pass.
• useToon: whether toon shading is used; given by the pass in the case of model data.
• uniform: marks variables whose data remains constant during shader execution.
Return value
VS_OUTPUT, the structure passed to the pixel shader.
132       VS_OUTPUT Out = (VS_OUTPUT)0;
133
Initializes the structure members with 0. It is an error if a returned member is undefined.
134       // world-view projection transformation of camera viewpoint
135       Out.Pos = mul( Pos, WorldViewProjMatrix );
136
Converts the local coordinates of the vertex to projective coordinates.
137       // position relative to camera
138       Out.Eye = CameraPosition - mul( Pos, WorldMatrix );
Calculates the reverse eye vector (from the vertex toward the camera?).
139       // vertex normal
140       Out.Normal = normalize( mul( Normal, (float3x3)WorldMatrix ) );
141
Computes the normalized normal vector of the vertex in world space.
• normalize (x): normalizes a floating-point vector as x/length(x).
• length (x): returns the length of a floating-point vector.
142       // Diffuse color + Ambient color calculation
143       Out.Color.rgb = AmbientColor;
144       if ( !useToon ) {
145           Out.Color.rgb += max(0,dot( Out.Normal, -LightDirection )) * DiffuseColor.rgb;
The dot product of the vertex normal and the reversed light vector gives the influence of the light (0-1); the diffuse light color scaled by this influence is added to the ambient light color. DiffuseColor is black because LightDiffuse is black, and AmbientColor is the diffuse light of the material. Confirmation required.
• dot (x,y): returns the dot product of the vectors x and y.
• max (x,y): returns the greater of x and y.
146       }
147       Out.Color.a = DiffuseColor.a;
148       Out.Color = saturate( Out.Color );
149
• saturate (x): clamps x to the range 0-1; values below 0 become 0 and values above 1 become 1.
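The diffuse-plus-ambient arithmetic can be sketched in Python as follows. This is a toy re-implementation for illustration only; the vectors and color values are made up, not taken from MMD.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def saturate(v):
    # clamp each component to the 0-1 range, as HLSL saturate() does
    return [min(1.0, max(0.0, x)) for x in v]

normal = [0.0, 1.0, 0.0]        # vertex normal, pointing up
light_dir = [0.0, -1.0, 0.0]    # light shining straight down
ambient = [0.2, 0.2, 0.2]       # made-up AmbientColor
diffuse = [0.6, 0.5, 0.4]       # made-up DiffuseColor.rgb

# light influence: dot of the normal and the reversed light vector, clamped at 0
influence = max(0.0, dot(normal, [-x for x in light_dir]))
color = saturate([a + influence * d for a, d in zip(ambient, diffuse)])
```

With the surface facing the light, the influence is 1.0 and the full diffuse color is added to the ambient term before clamping.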
150       // texture coordinates
151       Out.Tex = Tex;
152
153       if ( useSphereMap ) {
154           // sphere map texture coordinates
155           float2 NormalWV = mul( Out.Normal, (float3x3)ViewMatrix );
The x and y coordinates of the vertex normal in view space.
156           Out.SpTex.x = NormalWV.x * 0.5f + 0.5f;
157           Out.SpTex.y = NormalWV.y * -0.5f + 0.5f;
158       }
159
Converts the view-space values of the vertex normal to texture coordinate values. Idiomatic.
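The remap of the normal's view-space x and y from [-1, 1] into [0, 1] can be sketched as follows (a hypothetical helper written for illustration; it is not part of the shader):

```python
def sphere_map_uv(normal_view_xy):
    # map a view-space normal's x,y from [-1, 1] into texture space [0, 1];
    # y is negated because texture v grows downward
    nx, ny = normal_view_xy
    return (nx * 0.5 + 0.5, ny * -0.5 + 0.5)

# a normal pointing straight at the camera samples the sphere map's center
center_uv = sphere_map_uv((0.0, 0.0))
```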
160       return Out;
161   }
Returns the structure that was set up.
163   // pixel shader
164   float4 Basic_PS(VS_OUTPUT IN, uniform bool useTexture, uniform bool useSphereMap, uniform bool useToon) : COLOR0
165   {
Specifies the color of the pixels to be displayed on the screen.
Parameters
• IN: the VS_OUTPUT structure received from the vertex shader.
• useTexture: whether a texture is used; given by the pass.
• useSphereMap: whether a sphere map is used; given by the pass.
• useToon: whether toon shading is used; given by the pass in the case of model data.
Output value
Output color
166       // specular color calculation
167       float3 HalfVector = normalize( normalize(IN.Eye) + -LightDirection );
Finds the half vector from the reverse vector of the line of sight and the reverse vector of the light.
• Half vector: a vector midway between two vectors (their normalized sum). Used instead of calculating the reflection vector.
168       float3 Specular = pow( max(0,dot( HalfVector, normalize(IN.Normal) )), SpecularPower ) * SpecularColor;
169
From the half vector and the vertex normal, find the influence of the reflection. Raise the influence to the specular power, then multiply by the specular light color to get the specular term.
• pow (x,y): raises x to the power y.
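The same Blinn-Phong half-vector math can be sketched in Python. The multiplication by SpecularColor is omitted and all vectors below are illustrative values, not MMD's.

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_specular(eye, light_dir, normal, power):
    # half vector: normalized sum of the view vector and the reversed light vector
    half = normalize([e - l for e, l in zip(normalize(eye), light_dir)])
    # raise the clamped half-vector/normal dot product to the shininess power
    return max(0.0, dot(half, normalize(normal))) ** power

# light and eye both along +z, surface facing the camera: maximal highlight
s = blinn_phong_specular([0, 0, 1], [0, 0, -1], [0, 0, 1], 20)
```

Higher `power` values narrow the highlight, which is what SpecularPower (PMD's Shininess) controls.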
170       float4 Color = IN.Color;
171       if ( useTexture ) {
172           // apply texture
173           Color *= tex2D( ObjTexSampler, IN.Tex );
174       }
If a texture is set, extract the color at the texture coordinates and multiply it into the base color.
• tex2D (sampler, tex): extracts the color at the coordinates tex from the 2D texture using the sampler settings.
175       if ( useSphereMap ) {
176           // apply sphere map
177           if(spadd) Color += tex2D(ObjSphareSampler,IN.SpTex);
178           else      Color *= tex2D(ObjSphareSampler,IN.SpTex);
179       }
180
If a sphere map is set, extract the color at the sphere map texture coordinates; add it to the base color if the sphere file is additive (.spa), otherwise multiply it.
181       if ( useToon ) {
182           // toon application
183           float LightNormal = dot( IN.Normal, -LightDirection );
184           Color.rgb *= lerp(MaterialToon, float3(1,1,1), saturate(LightNormal * 16 + 0.5));
185       }
In the case of a PMD model, determine the influence of the light from the normal vector of the vertex and the reverse vector of the light. The influence is remapped and clamped to 0-1, and the base color is darkened toward the toon color for lower influence values.
• lerp (x,y,s): linear interpolation as x + s(y - x); s=0 gives x, s=1 gives y.
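A minimal Python sketch of lerp and the toon remap saturate(LightNormal * 16 + 0.5), using illustrative values only:

```python
def lerp(x, y, s):
    # HLSL lerp: x + s * (y - x); s=0 gives x, s=1 gives y
    return x + s * (y - x)

def saturate(x):
    return min(1.0, max(0.0, x))

def toon_factor(light_normal):
    # remap the N.(-L) influence so the lit/shaded transition is sharp:
    # the factor is 0 (full toon shade) just below 0 influence and
    # 1 (fully lit) just above it
    return saturate(light_normal * 16 + 0.5)

lit = toon_factor(0.5)      # surface faces the light: no toon shade
shaded = toon_factor(-0.5)  # surface faces away: full toon shade
```

The shader then multiplies Color.rgb by lerp(MaterialToon, 1, factor), so a factor of 0 tints the pixel fully with the toon color and a factor of 1 leaves it unchanged.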
186
187       // specular application
188       Color.rgb += Specular;
189
190       return Color;
191   }
Adds the obtained specular term to the base color and returns the output color.
195   technique MainTec0 < string MMDPass = "object"; bool UseTexture = false; bool UseSphereMap = false; bool UseToon = false; > {
196       pass DrawObject {
197           VertexShader = compile vs_2_0 Basic_VS(false, false, false);
198           PixelShader  = compile ps_2_0 Basic_PS(false, false, false);
199       }
200   }
Technique performed on the subset of accessories (materials) that use neither textures nor sphere maps when self-shadow is disabled.
• "object": object when self-shadow is disabled.
• UseTexture: true for the texture-using subset.
• UseSphereMap: true for the sphere-map-using subset.
• UseToon: true for PMD models.
7. Z-VALUE PLOT FOR SELF-SHADOW DETERMINATION (セルフシャドウ判定用Z値プロット)
Creates the boundary values used to determine self-shadowing.
256   struct VS_ZValuePlot_OUTPUT {
257       float4 Pos : POSITION;              // projective transformation coordinates
258       float4 ShadowMapTex : TEXCOORD0;    // z-buffer texture
259   };
A structure for passing multiple return values between shader stages.
Parameters
• Pos: stores the position of a vertex in projective coordinates, as the homogeneous space coordinate vertex shader output semantic.
• ShadowMapTex: stores texture coordinates, as the texture coordinate 0 vertex shader output semantic, so the hardware calculates interpolated z and w values.
• w: scaling factor of the view frustum (which widens with depth) in projective space.
261   // vertex shader
262   VS_ZValuePlot_OUTPUT ZValuePlot_VS( float4 Pos : POSITION )
263   {
264       VS_ZValuePlot_OUTPUT Out = (VS_ZValuePlot_OUTPUT)0;
265
266       // do a world-view projection transformation from the light's point of view
267       Out.Pos = mul( Pos, LightWorldViewProjMatrix );
268
Converts the local coordinates of the vertex to projective coordinates with respect to the light.
269       // align texture coordinates to vertices
270       Out.ShadowMapTex = Out.Pos;
271
272       return Out;
273   }
Assigns the vertex coordinates to the texture coordinates so the hardware calculates interpolated z and w values, and returns the structure.
275   // pixel shader
276   float4 ZValuePlot_PS( float4 ShadowMapTex : TEXCOORD0 ) : COLOR
277   {
278       // record z-values for the R color component
279       return float4(ShadowMapTex.z/ShadowMapTex.w,0,0,1);
280   }
Divides the z-value in projective space by the scaling factor w to get the z-value in screen coordinates, assigns it to the r component and returns it (internal MMD processing?).
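The perspective divide that produces the stored depth can be sketched as follows; the clip-space coordinates below are made-up values for illustration.

```python
def z_plot_color(clip_pos):
    # perspective divide: clip-space z over w gives the normalized depth,
    # which the shader stores in the R color component of the output
    x, y, z, w = clip_pos
    return (z / w, 0.0, 0.0, 1.0)

# made-up clip-space position halfway into the light frustum's depth range
color = z_plot_color((0.0, 0.0, 4.0, 8.0))
```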
282   // techniques for Z-value mapping
283   technique ZplotTec < string MMDPass = "zplot"; > {
284       pass ZValuePlot {
285           AlphaBlendEnable = FALSE;
286           VertexShader = compile vs_2_0 ZValuePlot_VS();
287           PixelShader  = compile ps_2_0 ZValuePlot_PS();
288       }
289   }
Technique performed when calculating the z-values for self-shadow determination.
• "zplot": Z-value plot for self-shadow.
8. DRAWING OBJECTS IN SELF-SHADOWING (セルフシャドウ有オブジェクト描画)
Draws an object with self-shadowing.
295   // sampler for the shadow buffer. "register(s0)" because MMD uses s0
296   sampler DefSampler : register(s0);
297
Assigns sampler register 0 to DefSampler. Unclear when it is swapped with the earlier MMDSamp0. Not replaceable.
298   struct BufferShadow_OUTPUT {
299       float4 Pos      : POSITION;    // projective transformation coordinates
300       float4 ZCalcTex : TEXCOORD0;   // z value
301       float2 Tex      : TEXCOORD1;   // texture
302       float3 Normal   : TEXCOORD2;   // normal vector
303       float3 Eye      : TEXCOORD3;   // position relative to camera
304       float2 SpTex    : TEXCOORD4;   // sphere map texture coordinates
305       float4 Color    : COLOR0;      // diffuse color
306   };
VS_OUTPUT with ZCalcTex added.
• ZCalcTex: stores, as the texture coordinate 0 vertex shader output semantic, the texture coordinates used to calculate the interpolated z and w values of vertices in screen coordinates.
308   // vertex shader
309   BufferShadow_OUTPUT BufferShadow_VS(float4 Pos : POSITION, float3 Normal : NORMAL, float2 Tex : TEXCOORD0, uniform bool useTexture, uniform bool useSphereMap, uniform bool useToon)
310   {
Converts the local coordinates of vertices to projective coordinates and sets the values to pass to the pixel shader, returning a BufferShadow_OUTPUT structure.
Parameters
• Pos: local coordinates of the vertex.
• Normal: normal in the local coordinates of the vertex.
• Tex: UV coordinates of the vertex.
• useTexture: whether a texture is used; given by the pass.
• useSphereMap: whether a sphere map is used; given by the pass.
• useToon: whether toon shading is used; given by the pass in the case of model data.
Return value
BufferShadow_OUTPUT.
311       BufferShadow_OUTPUT Out = (BufferShadow_OUTPUT)0;
312
Initializes the structure.
313       // world-view projection transformation of camera viewpoint
314       Out.Pos = mul( Pos, WorldViewProjMatrix );
315
Converts the local coordinates of the vertex to projective coordinates.
316       // position relative to camera
317       Out.Eye = CameraPosition - mul( Pos, WorldMatrix );
Calculates the reverse vector of the line of sight.
318       // vertex normal
319       Out.Normal = normalize( mul( Normal, (float3x3)WorldMatrix ) );
Computes the normalized normal vector of the vertex in world space.
320       // world-view projection transformation from the light's perspective
321       Out.ZCalcTex = mul( Pos, LightWorldViewProjMatrix );
Converts the local coordinates of the vertex to projective coordinates with respect to the light, letting the hardware calculate interpolated z and w values.
323       // Diffuse color + Ambient color calculation
324       Out.Color.rgb = AmbientColor;
325       if ( !useToon ) {
326           Out.Color.rgb += max(0,dot( Out.Normal, -LightDirection )) * DiffuseColor.rgb;
327       }
328       Out.Color.a = DiffuseColor.a;
329       Out.Color = saturate( Out.Color );
Sets the base color. For accessories, adds the diffuse color to the base color based on the light influence, then clamps each component to 0-1.
331       // texture coordinates
332       Out.Tex = Tex;
Assigns the UV coordinates of the vertex as they are.
334       if ( useSphereMap ) {
335           // sphere map texture coordinates
336           float2 NormalWV = mul( Out.Normal, (float3x3)ViewMatrix );
When using a sphere map, converts the vertex normal vector to its x and y components in view-space coordinates.
337           Out.SpTex.x = NormalWV.x * 0.5f + 0.5f;
338           Out.SpTex.y = NormalWV.y * -0.5f + 0.5f;
339       }
340
341       return Out;
342   }
Converts the view-space coordinates to texture coordinates and returns the structure.
344   // pixel shader
345   float4 BufferShadow_PS(BufferShadow_OUTPUT IN, uniform bool useTexture, uniform bool useSphereMap, uniform bool useToon) : COLOR
346   {
Specifies the color of the pixels to be displayed on the screen.
Parameters
• IN: the BufferShadow_OUTPUT structure received from the vertex shader.
• useTexture: whether a texture is used; given by the pass.
• useSphereMap: whether a sphere map is used; given by the pass.
• useToon: whether toon shading is used; given by the pass in the case of model data.
Output value
Output color
347       // specular color calculation
348       float3 HalfVector = normalize( normalize(IN.Eye) + -LightDirection );
349       float3 Specular = pow( max(0,dot( HalfVector, normalize(IN.Normal) )), SpecularPower ) * SpecularColor;
350
The same specular calculation as in Basic_PS.
351       float4 Color = IN.Color;
352       float4 ShadowColor = float4(AmbientColor, Color.a);   // shadow's color
The base color and the self-shadow base color.
353       if ( useTexture ) {
354           // apply texture
355           float4 TexColor = tex2D( ObjTexSampler, IN.Tex );
356           Color *= TexColor;
357           ShadowColor *= TexColor;
358       }
When using a texture, extracts the color at the texture coordinates from the set texture and multiplies it into both the base color and the self-shadow color.
359       if ( useSphereMap ) {
360           // apply sphere map
361           float4 TexColor = tex2D(ObjSphareSampler,IN.SpTex);
362           if(spadd) {
363               Color += TexColor;
364               ShadowColor += TexColor;
365           } else {
366               Color *= TexColor;
367               ShadowColor *= TexColor;
368           }
369       }
As in Basic_PS, when using a sphere map, adds or multiplies the corresponding colors.
370       // specular application
371       Color.rgb += Specular;
372
Applies the specular term to the base color.
373       // convert to texture coordinates
374       IN.ZCalcTex /= IN.ZCalcTex.w;
Divides the z-value in projective space by the scaling factor w, converting to screen coordinates.
375       float2 TransTexCoord;
376       TransTexCoord.x = (1.0f + IN.ZCalcTex.x)*0.5f;
377       TransTexCoord.y = (1.0f - IN.ZCalcTex.y)*0.5f;
378
Converts the screen coordinates to texture coordinates.
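The two assignments above are the standard remap from normalized device coordinates to texture space. A Python sketch (hypothetical helper name, assuming NDC x,y in [-1, 1]):

```python
def shadow_map_uv(ndc_xy):
    # map NDC x,y in [-1, 1] to shadow-map texture coordinates in [0, 1];
    # y is inverted because texture space runs top-down
    x, y = ndc_xy
    return ((1.0 + x) * 0.5, (1.0 - y) * 0.5)

# the center of the light's view lands in the center of the shadow map
center = shadow_map_uv((0.0, 0.0))
```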
379       if( any( saturate(TransTexCoord) != TransTexCoord ) ) {
380           // external shadow buffer
381           return Color;
Returns the base color if the vertex coordinates fall outside the 0-1 range of the texture coordinates.
382       } else {
383           float comp;
384           if(parthf) {
385               // self-shadow mode2
386               comp=1-saturate(max(IN.ZCalcTex.z-tex2D(DefSampler,TransTexCoord).r , 0.0f)*SKII2*TransTexCoord.y-0.3f);
In self-shadow mode2, take the z-value from the shadow buffer sampler and compare it with the z-value of the vertex; if the vertex z is smaller, it is not in shadow. If the difference is small (close to the start of the shadow), the shadow is heavily corrected. (Weaker correction toward the top of the screen?) This weakens the base color.
        } else {
            // self-shadow mode1
            comp = 1 - saturate(max(IN.ZCalcTex.z - tex2D(DefSampler, TransTexCoord).r, 0.0f) * SKII1 - 0.3f);
        }
Self-shadow mode1 does the same, but with SKII1 and without the TransTexCoord.y factor.
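Both modes compute the same depth comparison and differ only in the scale factor. A Python sketch of the comp computation (the function name and the SKII values below are illustrative, not taken from the effect file):

```python
def saturate(x):
    # HLSL saturate: clamp to [0, 1]
    return min(max(x, 0.0), 1.0)

def shadow_comp(frag_z, map_z, skii, v=None):
    """comp = 1.0 means fully lit, 0.0 fully shadowed.
    Mode 2 additionally scales by v, the shadow-map y coordinate."""
    diff = max(frag_z - map_z, 0.0)   # zero when the fragment is in front of the occluder
    scale = skii * v if v is not None else skii
    # the -0.3 bias keeps tiny depth differences from darkening anything
    return 1.0 - saturate(diff * scale - 0.3)

# fragment closer to the light than the stored depth: fully lit
print(shadow_comp(0.4, 0.5, skii=100.0))  # 1.0
# fragment well behind the stored depth: fully shadowed
print(shadow_comp(0.6, 0.5, skii=100.0))  # 0.0
```

The bias term is why faint acne-like depth differences near the shadow boundary do not darken the surface at all.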
        if ( useToon ) {
            // toon application
            comp = min(saturate(dot(IN.Normal, -LightDirection)*Toon), comp);
For MMD models, compare the shading factor produced by the light direction with the factor produced by the self-shadow, and take the smaller of the two as the final shadow factor.
• min(x, y): selects the smaller of x and y.
            ShadowColor.rgb *= MaterialToon;
        }
Multiply the self-shadow color by the toon shadow color.
        float4 ans = lerp(ShadowColor, Color, comp);
Linearly interpolate between the self-shadow color and the base color depending on the influence of the shadow.
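The toon minimum and the final lerp can be traced with small numbers (again a Python sketch; the normal, light direction, toon factor, and colors below are invented for illustration):

```python
def saturate(x):
    return min(max(x, 0.0), 1.0)

def dot3(a, b):
    return sum(x * y for x, y in zip(a, b))

def lerp(a, b, t):
    # HLSL lerp(a, b, t) = a + t * (b - a), per channel
    return tuple(x + t * (y - x) for x, y in zip(a, b))

normal = (0.0, 1.0, 0.0)
light_dir = (0.0, -1.0, 0.0)   # light shining straight down
toon = 1.0                     # illustrative toon factor
comp_shadowmap = 0.75          # illustrative shadow-buffer factor

# the darker (smaller) of the light-based shade and the self-shadow wins
comp = min(saturate(dot3(normal, tuple(-c for c in light_dir)) * toon), comp_shadowmap)

shadow_color = (0.25, 0.25, 0.5)
base_color = (1.0, 0.75, 0.5)
ans = lerp(shadow_color, base_color, comp)
print(comp)  # 0.75
print(ans)   # (0.8125, 0.625, 0.5)
```

At comp = 0 the output is pure self-shadow color, at comp = 1 pure base color; everything in between is a straight per-channel mix.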
        if( transp ) ans.a = 0.5f;
        return ans;
    }
}
If translucency (transp) is enabled, set the alpha of the output color to 0.5, then return the composited color.
// techniques for drawing objects (for accessories)
technique MainTecBS0 < string MMDPass = "object_ss"; bool UseTexture = false; bool UseSphereMap = false; bool UseToon = false; > {
    pass DrawObject {
        VertexShader = compile vs_3_0 BufferShadow_VS(false, false, false);
        PixelShader  = compile ps_3_0 BufferShadow_PS(false, false, false);
    }
}
Technique applied to the subset of accessory materials that use neither a texture nor a sphere map while self-shadowing is enabled.
• "object_ss": object drawing pass when self-shadow is enabled.
• UseTexture: true for the subset of materials that use a texture.
• UseSphereMap: true for the subset of materials that use a sphere map.
• UseToon: true for PMD models.
9. FINAL NOTES
For further reading on HLSL coding, please visit Microsoft's official English reference documentation.
đđđ đđ đ»đđđđĄđ - đ¶đđđđđđđ đđđđđ , đđđđđđđ đ
đđđ đ„ đđ¶ - đ¶âđđđĄđđ 1 : đđđđđđ
Masterlist
Rating: Mature
Summary:  đŽđđđđ đđđŁđđ đđđđđđđđ âđđđ đđđ đ€đđđđđđ đđ đđđĄđđđđđđđđđ, đđąđĄ âđđ đđđ đđđ đĄđ âđđđ đđĄâđđđ đđđ đąđđĄđđ đđ đĄđđđąđđđ đĄâđđĄ đđđđĄ âđđ đ€đđĄâ đđđĄđĄđđ đđđĄđđđđ . đđđĄâ đĄâđ đ đąđđđđđĄ đđ đĄâđ đ”đŽđ đđđđđđŠ, đđđŠđđ đ âđ đđđ đđđđđđđŠ đđđđđ đĄđ âđđđ đĄâđ đ€đđąđđđ đđ đĄâđ đđđ đĄ.
Fandom: Criminal Minds
Pairing: Spencer Reid x OC
Status: Ongoing
LONG TERM ONGOING PROJECT :)
My writing is entirely fuelled by coffee! If you enjoy my work, feel free to donate toward my caffeine dependency: will work for coffee
đŸđđđđđđđ: đșđđđđđđđđŠ đđđąđđĄ đđđđĄđđđĄ, đ€đđĄâ đ đđđ đĄđđđđđđđđđ đĄâđđđđ đđ đ€đđĄâ đĄâđ đ âđđ€. đđđđđ đ đđ đđ€đđđ đĄâđđ đđđđ đđđđ đđđŁđđđđđ đđđ đđ đđ đđąđđđđ, đâđđđ đđđđąđđĄđđđ & đ đđ„đąđđ đđđąđ đ đđ đĄâđđ đđ đĄâđ đđđĄđąđđ đđ đĄâđ đ”đŽđ'đ đ€đđđ. đŒđĄ đđ đđŠ đđđĄđđđĄđđđ đĄđ âđđđđđ đĄâđđ đ đđ đ đąđđ đđ đđđđđđąđđđŠ đđ đđđ đ đđđđ, đđąđĄ đđ đĄâđđđ đđ đđđŠđĄâđđđ đĄâđđĄ đŠđđą đđđđ đđđąđđ đđ đđđđđđŁđđ đ€đđĄâ âđđ€ đĄâđđ đ đđđ đđđđđđđ, đđđđđ đ đđđĄ đđ đđđđ€.
Episode: Pre Season 1
Chapter One
The smell of stale coffee was overpowering, almost more so than the lack of daylight in the room. Overworking was common practice here and as such, caffeine addiction was deeply ingrained into our culture. I was relieved to have my own office to allow me to focus away from the highly strung teams that depended on me. Topping up my teacup from the pot, I continued to scan my eyes through the various databases in search of something to provide us with a lead.
My eyelids were heavy from exhaustion as I battled to hold my concentration and almost jumped out of my chair as a shrill phone rang from the back of the desk. It was a secure line that I'd set up a while ago, but seldom ever used and I quickly got to my feet to close the door before picking up the handset.
"Wonderland." The careful voice announced a code that prompted a smile to fill my lips and I was immediately flooded with a warm sense of appreciation as I recognised them.
"Well, it's been some time since I heard that." I muttered curtly, settling back into my seat and a small satisfied sound on the other end of the line seemed to indicate that she was relieved to have confirmed my identity. "You're certainly working late, aren't you?"
"You know how it is. Monsters to catch." Penelope sighed, allowing me the chance to catch the fatigue in her voice that likely would have been easily missed by anyone else. "Speaking of which, I have a problem." She confessed, causing me to lean forward with riveted interest.
"We've got one of your boys causing mayhem over here. Your agents are keeping suspiciously schtum on it, territorial as ever. My team wants to stick with the case. I can't ask for official permission to access Interpol files when we've been told to drop it, but if I could just get five minutes in the system..." She trailed off suggestively and I chuckled lightly to myself.
"You're losing your touch, My Queen. I would have expected you to have found your way in already." I teased, referring to the name that I'd once known her by as I set to work on bringing up the files of the case. I didn't need specifics to clarify, I knew exactly which criminal my team was after that had required them to fly to the States to investigate and I heard her sigh.
"Any other time, you know that I would hack first and deny knowledge later, but I can't plead ignorance on something that we've already been clearly told to stay away from." She groaned, her frustration obvious even over the phone and I began typing rapidly as a plan formulated in my mind.
"I can't allow access to a foreign agency, my dear. The other techs would rat me out before you'd even found anything." I explained, chewing on my lip as I entered line after line of code in an attempt to outsmart my colleagues with techniques that would never be used in an official capacity. "However, if a back door was accidentally left open during maintenance, I couldn't be held responsible for anything that might creep in." I thought aloud as I sped through screens with a wicked smile, knowing that I could easily avoid any blame this way.
"I've assigned all of the correct credentials. Anything that you access will appear as if it's me doing admin in the case, but be careful. It's a very limited disguise that won't last long if anyone starts to dig at it. I can keep our techs busy for a little while, but you know that subtlety isn't my strong suit. Get out before you're noticed." I instructed as I entered the last few commands and could already hear her making preparations for the task.
"Oh, sweetheart. I will be gone before they even know what hit them." She breezed, her voice filled with a confidence that was contagious and I took a deep breath as I allowed it to pass onto me.
"Launching now." I confirmed, as I entered the final key and my screen began filling with pages of data. "Make it worthwhile. Catch the bastard."
--♥--
"You think it's an inside agent?"
My sector chief stared me down with an intensely disbelieving expression and I struggled to keep my nervousness from showing. It was incredibly nerve wracking to suggest this in a team where I was already the odd one out, but I couldn't allow my own insecurities to prevent us from getting justice. Ever since I had started here, I'd had the deep rooted feeling that none of my colleagues approved of the decision to recruit someone who should have been arrested and the disdain always seemed most powerful when addressing the man before me.
"That is my belief, Sir." I answered, hoping that he wouldn't notice my false confidence and I forced myself to hold his gaze as he crossed his arms. "Valeno has successfully dodged every digital trap that I've laid. He's avoided multiple agencies now and essentially vanished without a trace. It wouldn't be possible to achieve that without inside knowledge of the measures that we are taking." I explained, carefully presenting my theory and he remained unmoved as he studied me, causing me to gulp in discomfort.
"Find me proof, Hawthorne. I can't call a witch hunt based on your bitterness against this team." Shepard responded coldly, turning his back on me without another word to take a seat at his desk and I gulped down the anger that I felt at this unfair accusation. When I remained rooted to the spot in confusion, he simply gestured for me to show myself out.
I stomped back to my office with my cheeks burning in humiliation and closed the door behind me so that I could sink down against it with my face in my hands. It was endlessly frustrating to have invested so much time into pursuing this trafficking ring, only to have them continuously avoid capture. I was exhausted by this case, unable to remember my last day off since I was assigned to it and I could feel that I was running out of ideas, as I faced a heart breaking lack of support.
A loud alert on my computer pulled me from my wallowing and I rushed into my seat with interest. Flashing on my screen was a warning that made my stomach lurch and I stared at it with wide eyes. After months of silence, someone had finally accessed one of the booby trapped files.
I jumped into action, entering commands at lightning speed and became determined to capture whomever it was that had been defeating me for many long months. They were quick, evading my tactics with ease and I found myself engaged in a maddening game of digital hide and seek in the various systems. It was immediately clear that I was dealing with a professional and I cursed under my breath as I strained to keep up with them, setting fresh traps as I worked in the hope of preventing their escape.
"Oh no you don't, you little bugger! Not today!" I hissed under my breath, feeling sweat on my brow as I fretted that one wrong move could lose them forever.
As I watched them evade my tracking, then jump straight into blocking me out with expert knowledge, a memory stirred in the back of my mind. The theory rapidly took hold and I wasted no time in throwing out a manoeuvre that I knew would cause a bolt of familiarity if the culprit was who I suspected. Barely moments later, my screen fizzled in error, before displaying a bizarre Tetris overlay that could only be the signature of one person and I gasped, reaching for the phone in a fluster. I could hardly dial the number properly with my shaking hands and stumbled over my words as they answered hurriedly.
"Tell me that you're the person flooding my system with Tetris right now!" I spat, cutting her off before she could even speak and I heard the noisy typing on the other end rapidly stop.
"I knew it! You're the only hacker I know that would use Alice in Wonderland riddles to throw me off!" Penelope gasped with excitement and I released a breath that I didn't even know I had been holding in relief. It had been almost six months since I last heard her voice and although I would usually be pleased to speak to her, I couldn't help a wave of disappointment at the realisation that I hadn't caught anyone connected to the case.
"Well done, Reid! I just almost hacked our best chance of cracking your sicko lady thief. Thank god for your crazy book memory." I heard her chatting to someone in the background and cleared my throat to regain her attention.
"I'd recognise those moves anywhere. You're the only person that aggressive in code. What are you doing in my case?" I asked, rubbing at my temples in confusion and though I could hear someone barraging her with questions about who she was talking to, she remained obediently focused on me.
"Your case? I'm researching an abduction for the team." She revealed, sounding as if she hadn't even realised what was happening yet and I felt my eyes widen in horror. "Wait a second. Interpol is on this?" She breathed, causing sounds of shock to echo from whoever was in her background and I leapt to my feet as my thoughts bounced around in my mind.
"Penelope, I need you to send me everything you have. Right now! I think you've got a much bigger problem on your hands than you even know." I ordered, flicking through the case files that covered my desk in a fluster and without a moment of hesitation, the details began to pop up on my screen. As I acknowledged the matching signature to the photos on my desk, I held a hand to my mouth in shock.
"Valeno." I whispered as adrenaline shot through my entire body and I grabbed my phone as I began marching through the halls of the office. "Get your Unit Chief ready for a call. I'm taking this to the chief now." I instructed, before hanging up and striding confidently back toward the office that I'd been so rudely dismissed from a short while ago.
Without awaiting permission, I barged inside and switched on the monitor on the wall. I flicked it to the channel that connected to my station and displayed the crime scene photos that were sent by Penelope. At first, Shepard seemed irritated by the brashness of my approach, but as I flicked through the photos with a determined energy, understanding dawned on his features.
"Look familiar?" I suggested, before spreading the case files out on the table to further explain my point. "Every single detail, exactly the same. Matching victimology, no forensics detected at the scene, no ransom. He's back." I insisted, pointing out each fact with a passion that I had rarely allowed to show since I joined this role and he stared up at the screen with conviction.
"Where?"
"Virginia, United States. It's just like I said. He's trying to hop jurisdiction to exploit the black spots between agencies and avoid detection." I explained, feeling equally validated and frustrated that my theory had been correct. This meant that more lives had been destroyed by this man and the realisation caused a wave of nausea to wash over me.
"Sir, the FBI are late to this. They're only just linking the abductions together. We could get ahead of him this time, if we assisted." I appealed, holding my hands out in exasperation and he finally tore his eyes from the screen to examine me with an unreadable face.
"You have contact with the team?" He enquired curiously and I nodded in confirmation.
"Yes, Sir. Their Unit Chief is waiting for your call." I presented, afraid to move a single muscle for risk of damaging the fragile alliance and after a few minutes of tense silence, he finally sighed.
"Set up the meeting through Lucy." He instructed, causing me to almost run from the room to find our case coordinator, before he called out to stop me.
"If they accept our support, we'll be heading over to join the team." He stated calmly and I held my breath as I paused on the spot. "You know his patterns better than anyone. You think you can catch this mole?"
"Yes, Sir." I asserted, the words escaping my lips before I'd even had the chance to consider them and for the first time, he softened his expression as he viewed me.
"Then you'd better get ready to travel. You'll be joining us."
--♥--
There was something awfully intimidating about entering the FBI headquarters, despite my own professional background. I hesitated at the back of the team, unsure about the etiquette of these kinds of scenarios as I'd never worked on a shared agency case in person, unlike the others. My skills were shared between multiple Interpol teams and as a result, I didn't have a bond with anyone, especially not in the group that I'd travelled with.
The rest of the agents seemed completely at home as they stepped out of the elevator and made a beeline for the main BAU office, whilst I slowed to a stop in the reception, feeling too overwhelmed to continue.
Everything felt immensely bigger than I was used to and the entire process had been so rushed that I'd hardly had a chance to process any of it. This was my first visit to the States at all in many years and I couldn't catch up with the emotional effect of this. I brushed the hair that had escaped my scruffy braid out of my face and attempted to straighten out my clothes, convinced that I appeared completely haggard after being made to attend the office straight from a long flight.
I took a deep breath and forced myself to push open the glass doors to join my team, only to be ambushed before I could.
"Wonderland!" An excitable voice cheered from my side and I turned to find a very colourful woman with a huge mane of bright blonde hair barrelling toward me.
"Penelope Von Troublemaker." I chuckled as she swept me into a hug that almost knocked me from my already unsteady feet. She smelt fruity and fresh as she squeezed the life out of me and I wheezed for breath.
"It's wonderful to see you at last, but I really do need to breathe, my dear." I wheezed, prompting her to quickly jump back, leaving only her hands on my shoulders as she beamed at me.
"You're so pretty, even more so in person! And classy too. Look at you in your head to toe black chic. I love it." She breathed, scanning my full appearance with excitement radiating off her in waves and I managed a nervous smile in response. I still remembered Penelope as a late teen with an attitude problem, who often engaged me in fierce competition and was glad to find that adulthood had encouraged her to embrace her natural sparkle.
"Sir. I'm going to take Alice straight to my office to set up, okay? Cool. Thanks." She called over toward the rest of the office carelessly, pulling me along behind her without even awaiting a response and I couldn't help giggling to myself as I followed.
Penelope's office was exactly what I had imagined, full of the same sunshine attitude that she radiated. Her desk was littered with cute figures, bright photos and pen pots filled with fuzzy tipped biros. She had more screens mounted all over the walls than I could even have applied for permission for at Interpol and I noticed that there was a small space at one end of the surface which was completely clear, with a second office chair tucked under it.
"You're gonna be right here, next to me. If we're gonna beat this guy, I figured we should team up, just like the old days." She explained with a fond smile and I felt myself relaxing already in her warm company. "Just go ahead and drop your laptop stuff there and then I can show you where to store the rest of your stuff until we can take you to your hotel." She instructed, indicating to my small case.
"Oh. This is everything that I have with me." I answered bashfully, shuffling inside to begin setting up my laptop in the space indicated, whilst Penelope studied me with confusion.
"You don't have a separate go bag?" She investigated, seeming thoroughly shocked by how little I had with me and I shrugged in return.
"I don't really 'go', Penelope." I chuckled as I busied myself with plugging everything in and noticed that her expression seemed concerned now. "I didn't have much notice. I grabbed all of my casefiles and the laptop stuff that I could carry, and then just whatever personal items I could think of. You'll have to forgive me if I'm not as well presented as you expected. I don't travel much nowadays." I added, hoping that this would settle her curiosity but instead she took a seat beside me and scrutinised me thoroughly.
"Why not? Hotch and Gideon even send me out when I'm needed and I don't have a psychology degree." She commented as she raised her brows at me, as if she needed to remind me of my own education and I simply shrugged avoidantly. "Are Interpol not making good use of you, Sugar? Whose career do I need to ruin?" She threatened, already seeming riled by the idea and I smiled at her protectiveness.
"It's more complicated than that. Let's crack this case and then we get into personal catch ups later, okay?" I suggested, much to her disappointment and my relief, we settled into work.
We spent the next hour comparing information and plotting possible traps, whilst Shepard and the rest of my team busied themselves with integrating into the local team and finding places to work. It was refreshing to work alongside someone for a change and having her input prompted new ideas.
After a while, the phone rang and Penelope leaned over to answer it on speakerphone so that she could continue examining the document that was between us.
"Hey, baby girl. Tell me you've got some good news for me." A deep voice crooned, causing my attention to bolt upward in confusion, only to find Penelope smirking in satisfaction.
"Well, gorgeous, that I do. I am still 100% single, interested and available for dinner whenever you need." She announced, earning a chuckle from the man on the line and I quirked a brow at her inquisitively.
"Also, the Interpol team just arrived. So I have now upgraded to the office of Unrivalled Girl Power, baby." She added with delight and I felt my curiosity growing by the minute as she spoke.
"That was fast. Alright, Gideon and I will make our way back now. This lead was a bust." He confirmed, revealing that he was actually a member of her team and I raised my brows in surprise at her demeanour.
"Don't keep me waiting too long." She drawled in a husky tone, before hanging up and attempting to return to her work with a smile. I playfully slapped her hands away from the files and thinned my eyes at her.
"I'm sorry. Are you really going to try to gloss over that? Explain, please!" I insisted, leaning closer to her in fascination and she chuckled as she shook her head dismissively. "Who was that, Penelope?" I gasped, feeling as if I might explode from the mystery and she attempted to swat me away.
"That was Derek Morgan. He's one of the behavioural analysts and a dear friend." She began, only causing my brows to raise impossibly further and she leaned closer into me to lower her voice. "He's also an absolute hunk of a man, but you'll see that in a bit." She winked, suppressing a cheeky cackle and I felt my mouth drop open in shock.
"And do you speak to all of your dear friends like that, or is he special? He sounds special." I hissed, grilling her like an excited teenager again, falling straight back into our old friendship as if no time had passed at all and she laughed heartily.
"Oh, please. Nothing wrong with a little workplace flirtation to brighten the day. You know what I mean! You work in Europe. You're surrounded by romance." She muttered dismissively as she returned her attention to her screens and I shook my head.
"I absolutely do not! You have no idea, Nels. Europe is not inherently romantic." I drawled, rolling my eyes at her ignorance and she glanced back at me with a disbelieving smirk.
"I'm serious. Most of the men at Interpol either think that they're the real life incarnation of James Bond, collecting as many Bond girls as possible, or they're married to their work. It's honestly rather dull. Nothing like you and Agent Morgan. That conversation sounded like a lot more than just playful banter to me. You two have chemistry!" I insisted, a smug smile filling my lips as I viewed her and she quickly brushed me off.
"Don't be ridiculous, Ally." She breathed, the slightest hint of a blush filling her cheeks and I made a mental note to revisit this later. "We need to get our strategy ready. Hotch will call a meeting once Morgan and Gideon get back."
"Garcia, do you have the suspect list that I asked you to pull? I want to run through it before the meeting." A voice interrupted from the door and I turned to find a tall, slim young man waiting keenly.
"Oh, yes! Let me just get it printed for you." Penelope answered, pushing the files aside to access her keyboard. "I'm sorry. I got distracted with helping Alice to settle in." She added, before glancing at me to realise that he and I were looking at each other with interest.
"Ah, of course! Alice, this is Dr Spencer Reid, one of our behavioural analysts. Reid, this is Alice Hawthorne from Interpol. She's a very old friend." Penelope introduced with a flourish before returning to her task and I jumped to my feet to shake his hand, only for him to wave awkwardly instead. Though it was an interesting quirk, I was relieved as I hated contact with strangers and was glad that I would have at least one less person that I would have to endure it with.
"You're the hacker with the Alice in Wonderland riddles?" He offered, viewing me with interest and I felt my cheeks turning pink the moment that I met his hypnotising hazel eyes. Now I recognised his voice from my call with Garcia and remembered the questions he'd been asking in the background after his rather embarrassing introduction to my work.
"Um, yes. I don't usually do that. I had a feeling that it was Penelope and I knew that she would understand that. Or at least, I hoped that she would." I offered, shuffling awkwardly on the spot and he nodded in understanding.
He was dressed remarkably smart compared to many of the other FBI agents, but still had an air of his own style about him that interested me. His neatly combed back hair complimented his handsomely chiselled features and I soon found myself struggling for something to say. I already considered myself to be somewhat socially challenged, but especially when it came to interacting with someone who was unexpectedly well matched to everything that I found attractive, I was a hot mess.
"It's an interesting choice. Did you know that it was actually rated as one of the world's most influential novels? It's full of nonsensical rhymes that could easily be used to create a rather complicated ruse, as not many people would know the answers offhand." He complimented in what I assumed was an attempt to make me feel better about my rather childish tactics and I smiled in appreciation of his kindness.
"There's actually a rare disorder named after it that is characterised by distortions of visual perception, the body image and the experience of time." He began to explain and I couldn't keep myself from interrupting him in excitement.
"Todd's Syndrome, or dysmetropsia, right?" I interjected, causing his brows to shoot up in surprise and Penelope actually paused what she was doing to turn and look at us both in disbelief. "I've never encountered it, but I did study it in depth. It was actually the topic of my dissertation at university." I clarified, causing a wide smile to spread across his face as he considered me and I could feel Penelope smirking at me from my peripheral vision.
"Your suspect list, Reid?" She offered, clearing her throat as she leaned toward him with a piece of paper and he seemed to shake himself back to reality to take it.
"Yes. Thank you, Garcia." He blustered, taking it from her in a distracted manner and running his finger down the page at an impossible speed to have actually read anything. Before I could question this, he glanced back up in shock and his posture became alarmed. "I have to get this to Hotch. Nice to meet you, Alice!" He called as he rushed from the room and left Penelope and I silently communicating our confusion to each other.
"Okay, Miss Thing. You can't sit there and tell me that you don't flirt at work, and then do that right in front of me." She crooned, looking after Reid with a mischievous appreciation and I turned back to face her with a shrug, hoping that I could conceal my bashful reaction. "Oh, you want to play innocent? I see how it is. Well, I'm not fooled. I know what I saw and I'm not one to forget."
#warofhearts#criminalminds#oc#fanfic#fanfiction#writing#Alice#Alice Hawthorne#original character#Spencer Reid#Penelope Garcia#Jennifer Jareau#Aaron Hotchner#Jason Gideon#Derek Morgan#Elle Greenaway#Emily Prentiss#David Rossi#spencer reid x you#spencer reid x reader#spencer reid x oc#spencer reid fanfiction#spencer reid criminal minds#spencer reid series#criminal minds fanfiction#criminal minds oc#criminal minds insert#matthew gray gubler#matthew gray gubler fanfiction
Interpolation Techniques Homework Help
https://www.matlabhomeworkexperts.com/interpolation-techniques-assignment-help.php
Students can also avail themselves of our online tutoring sessions at a reasonable cost. Our Matlab Interpolation Techniques tutors can teach every kind of technique in the simplest way, so that you can understand it easily. Interpolation Techniques projects and assignments solved by our experts have helped many students across the world achieve higher grades. Our main motto is to provide constant guidance to students in the subject area and help them learn the concepts in a simple yet succinct manner. We frequently receive requests from students for help with the topics and types of Interpolation Techniques listed below, so we have a team with expertise in these topics to help you out:
Linear Interpolation and Geometric Ratios
Multivariate interpolation
Newton-Cotes formulas
NIST Digital Library of Mathematical Functions
Piecewise Linear Interpolation
Polynomial evaluation & interpolation
Repeated Linear Interpolation
Simple rational approximation
Two-dimensional Interpolation
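Several of the listed topics build on the same core idea. For instance, piecewise linear interpolation fits in a few lines of Python (our own illustrative sketch, not material from the tutoring service; MATLAB's interp1 performs the equivalent table lookup):

```python
from bisect import bisect_right

def linterp(xs, ys, x):
    """Piecewise linear interpolation through the points (xs[i], ys[i]).
    xs must be sorted ascending; x outside the table is clamped."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_right(xs, x) - 1            # segment containing x
    t = (x - xs[i]) / (xs[i + 1] - xs[i])  # position within the segment
    return ys[i] + t * (ys[i + 1] - ys[i])

xs = [0.0, 1.0, 2.0]
ys = [0.0, 10.0, 0.0]
print(linterp(xs, ys, 0.5))  # 5.0
print(linterp(xs, ys, 1.5))  # 5.0
```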
#Interpolation Techniques Homework Help#Interpolation Techniques Assignment Help#Interpolation Techniques Homework Help Experts#Interpolation Techniques Project Help#Interpolation Techniques Online Help#Online Interpolation Techniques Homework Help#Interpolation Techniques Help
đLayla ?
10 Facts About Layla Birch-Fang
She has a very strong sense of responsibility. As the eldest of her siblings, she sees herself as the protector of her younger brothers and sister and tends to want to keep an eye on them, whether they want it or not. She also feels responsible for preventing any bad thing she sees happen due to her precognition abilities, even though she can't always change things. Part of her character arc is learning to both let her siblings have some breathing room and to understand that just because she knows something is going to happen, it doesn't mean it's her fault if she just can't do anything to prevent it. She's not omnipotent.
She's very neat and well organized and grows anxious when things are disordered in her space. She has a habit of cleaning or organizing when she's stressed out, be it her own home or other people's.
She has a very professional dress sense, fitting right into the Interpol dress code before she even was recruited. She typically wears blazers and skirts and always carefully coordinates her outfits (out of the various dark colors allowed by the Interpol dress code.)
After years of practice, she was able to hone her precognition abilities to the point she can choose what kind of vision sheâs going to have. This is most useful for her work in the Ultra Beast department of Interpol as sheâs able to accurately predict when and where wormholes will open up, allowing her team to be able to minimize damage caused by rampaging and lost UBs.
She works mainly with the Faller version of her mother, who she refers to as her Aunt off-duty. Faller Anabel has come to see Layla as a sort of surrogate daughter herself over time as well. Layla also ends up working a lot with Ohia, who himself is a Faller and a sort of on-call employee for Interpol, assisting their studies of UBs and the nature of wormholes in Alola along with Weiss, who has psychic powers that allow him to manipulate Ultra Wormholes. She strongly dislikes working with the two younger men however, due to their laid back attitude towards the work, since if thereâs one thing Layla hates, itâs not taking serious matters like UBs with the gravitas they deserve.Â
She first met her boyfriend, Caleb, through their mutual friend Opal. At first she hated his flirtatious and habit of joking around, but over time as she got to know the âtrueâ Caleb under the front he puts up she came to like him, eventually falling for the crafty and well-read, but still fun and cheerful man. He often jokes that heâs the only one who can get Layla to laugh (which is not true, but she does find herself more at ease around him, able to relax and not worry as much about her responsibilities.)Â
Sheâs also childhood friends with Rudy and Safir, Roxanne and Brawlyâs twins. However, sheâs not as close with them when sheâs older due to an incident when they were children. The three were exploring near Granite Cave when Layla had a vision of them being attacked by scary looking wild pokĂ©mon. She immediately ran back down the beach to where Brendan (who was babysitting them) was, but while she was fetching him to help, her friends accidentally disturbed an injured wild Salamence that was hidden in the foliage. Brendan was able to subdue the dragon with the help of his pokĂ©mon, but Rudy still ended up getting a fairly serious injury that left a prominent scar. Since that event, Layla has blamed herself for Rudyâs injury and been somewhat distant from her old friends, especially since while Rudy holds no grudge towards her, knowing she was only trying to help, Safir feels a bit bitter that Layla left them to tell an adult rather than just warning them that there was danger so they could have avoided the entire situation.
She used to love watching her father Wally's gym battles as a child, wanting to be like him when she grew up. Originally, she would be brought to watch by Anabel or Brendan, but later on she goes by herself, visiting after school and taking notes on her father's techniques. As a result, her battle style is strongly influenced by his. Despite currently working for Interpol, she still dreams of one day inheriting the Petalburg Gym from Wally.
Prior to joining Interpol, she didn't know that she would be working with a Faller version of her mother, but Faller Anabel ended up being assigned as Layla's mentor (definitely done on purpose by the upper management). Layla ends up introducing her "aunt" to her mother, which ends up being a very weird meeting for everyone involved.
Layla drinks a lot of tea, of all kinds. She drinks herbal tea to relax, green tea to help her focus, and black tea when she needs an energy boost. Her parents gifted her a fancy box to keep her collection in, which is now proudly displayed on her desk at work. She's very protective of her tea as well, not wanting to share it with anyone and going on a crusade to find the culprit if she finds any missing.
The Use of Cryodehydrated Animal Anatomical Segments for Veterinary Anatomy Teaching - Juniper Publishers
JUNIPER PUBLISHERS-OPEN ACCESS JOURNAL OF DAIRY & VETERINARY SCIENCES
Abstract
Veterinary anatomy teaching in veterinary schools around the world has been supported by diverse innovations such as digital resources and applications, while the use of cadaver dissection has become very limited. In the present study, cryodehydrated anatomical pieces of musculoskeletal structures from large and small animals were used for five years, and the advantages and outcomes were recorded. The material was prepared by fixing fresh material with 10% formalin, followed by dissection and freeze-thaw sequences until completely dry. Paints were applied to give a natural appearance. The material produced, including complete limbs from large and small animals, was of good quality, with preservation of structures such as ligaments, muscle mass, tendons, and aponeuroses. Topographical relationships were perfectly maintained, and the pieces proved to be reliable material for practical classes in musculoskeletal anatomy. The method was easy to apply, very cheap, and stress-free to manipulate and store, and, owing to its high durability, it reduced the discharge of biological waste and chemical products. The students responded very favorably to this kind of material, which was reflected in a high approval rate in the practical examinations. The experience of creating an additional course to teach how to prepare the material was successful and has been a great opportunity for students in the final years of the program to work together and share experiences with beginners while reviewing anatomical content with an eye to its applicability in clinical and surgical practice.
Keywords: Veterinary anatomy teaching; Cryodehydration technique; Cadaver conservation
  Introduction
It is unquestionable that recognizing gross anatomical structures and their normal appearance and position in healthy animals is fundamental to veterinary training. Most veterinary schools around the world include at least two gross anatomy courses in their curricula. Historically, teaching in this area relied on cadaver dissection, but this practice has been much debated [1,2] and is no longer employed in most universities. Although some studies suggest that dissection, coupled with associated educational activities, is an effective pedagogical strategy for learning [3,4], digital approaches are nowadays more frequently used to describe muscular structures [5], and cadaver use is very limited. Unlike osteological material, which is permanent and can be stored in laboratories and museums with relative ease, muscular structures are expensive to preserve. Many techniques are known for keeping this kind of material for a long time [6], but most involve high maintenance costs in suitable tanks. Digital lectures [7] and 3D applications [8] based on reconstructions of series of drawings are very popular and helpful among students, because the applications allow layers of muscle to be added or removed and provide basic knowledge of spatial anatomy [9]; in most of them, however, the interpolation of data is far from the natural structure, and students have difficulty when facing an actual anatomical piece in the laboratory. Synthetic models are also useful, but their acquisition cost is considerable and they are not as attractive and interesting to students as laboratory practice. Gross anatomical dissection is a crucial part of veterinary education because it builds important manual skills and anatomical knowledge, as well as an understanding of the spatial relationships of the structures and organs of the body.
Practices with cadavers have become scarcer for several reasons: the low availability of dead animals for this kind of practice, since animals are protected by international laws that forbid euthanasia for education or research; the use of toxic substances to preserve the material; and the need for appropriate facilities to store the anatomical pieces, especially for large animals, in which the discharge of biological fluids and preservatives can be a serious source of environmental contamination.
In southern Brazil, many veterinary schools focus mainly on farm animals because of the strong impact of cattle farming on the national economy. Other areas of veterinary medicine, including large-animal work such as equine clinics and sheep and swine production, are also important. At the Federal University of Pelotas, veterinary anatomy classes are offered in the first and second semesters of the curriculum, each comprising a total of 136 hours, half of which is laboratory practice. The course entitled Anatomy of Domestic Animals I (ADA I) receives about 160 students annually and focuses on musculoskeletal structures, comparing domestic species including both large and small animals. Parts of bovine carcasses are obtained by donation from slaughterhouses located near the university, but only body segments without economic value are available. Another source of biological material is the university's Animal Hospital, where the bodies of dead animals (small and large) free of infectious and zoonotic diseases are taken to the Anatomy Sector for class use. As a public institution, the university has limited financial resources, including those for buying enough synthetic models for all students, and few technicians to support the laboratories. Given the high number of students and the quantities of material needed in the labs, logistics are constrained by the space required to store and later discard cadavers. To make classes with up to 75 students working simultaneously in the laboratory viable, over the last two decades we have applied and developed anatomical techniques [10] to solve these problems. The aim of this work is to present these techniques and to show the good results they have produced in veterinary teaching.
  Material and Methods
Fresh cadavers from equines, bovines, and dogs were used in the present study; the carcasses were obtained from animals that died at the Animal Hospital of the University of Pelotas. The cryodehydration technique described in previous work [10] was augmented with painting and resin layers, producing anatomical pieces with superficial and deep muscle dissections. A quick step-by-step outline follows:
a. Fresh carcasses were used immediately or frozen at -20 °C for later processing;
b. Carcasses with complete integrity (i.e., without missing parts and with unsectioned skin) were preferably perfused with a 10% formalin dilution through the external carotid artery on one side of the neck only. The solution was distributed through the carcass via the vascular system using the pressure produced by raising a reservoir of formalin solution 2.5 m above the ground. The desired positioning of the head and limbs should be set before this step, because the fixed tissues are difficult to move afterwards. The infusion is complete when muscle stiffness is verified and a foamy discharge is released from the nares.
c. When only body parts are available, multifocal perfusion through muscle tissues and joint cavities is conducted by manual injection with needles attached to 50 ml syringes. The infusion is complete when all tissues, including the deepest ones, have been exposed to the fixative solution. In addition, the material should be immersed in 10% formalin tanks within the next 48 hours.
d. In both cases, the carcasses are covered with plastic and stored at room temperature (8-22 °C) for 24 to 48 h.
e. Carcasses are washed with tap water and dissected according to the needs of the proposed class, using traditional anatomical dissection techniques for muscle visualization.
f. At this stage, the anatomical pieces can be used in some laboratory classes, taking into account the partial volatilization of the formalin, which can irritate the eyes and nasal mucosa.
g. Dissected anatomical pieces undergo at least twenty cryodehydration cycles of freezing at -20 °C and thawing at 10 to 22 °C, at a relative air humidity of around 70-80%. After that, the material is kept out of the freezer until completely dry, which is verified when a fine paper towel pressed against the structures comes away completely dry.
h. A new careful debridement of tissues, using a scalpel to scrape away and remove dry fascia and perimysium, will better expose the muscle fibers. Removal of the periosteum in some parts is also needed and will allow better recognition of the structures.
i. Finally, the structures can be painted with gouache in color combinations close to the natural appearance of fresh tissue. Three coats of polyvinyl acetate glue are applied: the first diluted 50% in water and the next two using the pure glue. Drying time is about 12 hours.
Teaching application of the Material
For ten semesters, this kind of material was used in ADA I and was freely available in labs that were open at least 2 hours daily for extra-class study. In some pieces, structures were permanently labeled. Other kinds of gross anatomical pieces included formalin-fixed specimens and cryodehydrated metameric sections.
The presence of students in the labs was monitored, along with their use of the material and the general condition of the anatomical pieces. The performance of the students in subsequent examinations was recorded.
  Results
Dehydrated anatomical pieces made up about 70% of the material used in practical veterinary anatomy classes during the study. Some students additionally used digital applications, but synthetic models were not provided by the university. The material produced, including complete limbs from large and small animals, was of good quality, with preservation of structures such as ligaments, muscle mass, tendons, and aponeuroses. Topographical relationships were perfectly maintained, and the material proved reliable for practical classes in musculoskeletal anatomy. Some regions of clinical interest, such as the ligaments of the distal limbs of horses and cattle, including the entire sesamoid suspensory apparatus, were clearly distinguishable. The thorax and abdomen of dogs also provided excellent visualization of the muscles. All dehydrated pieces were stored in glass-door cabinets (Figure 1) and are freely used on laboratory tables by the students. The anatomical pieces are odorless, dry, and resistant to touch. Students like using them because they are very didactic and easy to handle and learn from. Attendance in the practical laboratory increased even when technicians were not available to provide material. The following topics showed good visualization of structures using cryodehydrated anatomical pieces.
Head of equine and dog
The head was divided with an electric band saw into left and right antimeric sections, which allowed visualization of structures on the medial and lateral aspects. On the medial aspect, the encephalon, ethmoid turbinates, nasal turbinates, hard palate, tongue, pterygoid muscle, parts of the hyoid apparatus, and the digastric muscle (in the horse) can be appreciated. From the lateral aspect, most of the facial musculature is visible, with deep or superficial structures exposed according to the previous dissections (Figure 2).
Limbs of equine, bovine and dog
The relationships among muscles and articular structures are clearly seen (Figure 3). Hooves were also sectioned, and the internal components, their exact positioning, and their syntopy can be easily studied (Figure 4).
Thorax and abdomen of small animals
These body segments were also studied in left and right antimeric sections, and internal organs were removed to highlight the muscles. Medium-sized dogs were chosen, and most of the fat accumulated between the muscles was manually removed prior to dehydration. In general, material less than 2 cm thick dehydrated more quickly. As most of the animals used had died from traumatic injuries, hydrogen peroxide (20 volumes, diluted 50% in water) was needed to clean surfaces impregnated with blood. In practical classes, the muscles of the trunk were also taught using wet pieces fixed with formalin, which allowed a better understanding of the muscle layers.
  Discussion
Cryodehydrated anatomical pieces from the digestive tract have been used in some veterinary schools in Brazil [11] with satisfactory results in anatomy teaching [12]. The methodology presented here for musculoskeletal structures of large and small animals has been applied successfully in veterinary anatomy teaching at the Federal University of Pelotas for more than 25 years, but the teaching experience had not previously been evaluated and published. The great contributions of cryodehydrated material as a teaching tool are its low cost, high durability, easy handling and storage, low toxicity, long-term reduction of environmental contamination by biological and chemical waste, and its attractiveness to students. Similar material is also produced by plastination [13,14], in which water and fat are replaced by certain plastics, yielding specimens that can be touched, do not smell or decay, and retain most properties of the original sample [11]. However, this process is not available in most veterinary schools around the world because of its high cost, since it requires specific equipment and chemical components. The cryodehydration technique used in the present work is inexpensive, requiring only formalin, a domestic freezer (for small animals), and simple materials easily found in hardware stores. Once prepared, the anatomical material can be stored for a long time (more than 20 years) in places free of humidity and decomposer insects (coleopterans). This reduces the logistics and cost of keeping large formalin tanks to store body parts. As water is lost during the process, the material loses approximately 60% of its original weight, which makes it lighter and easier to handle [10]. During practical classes and extra-class study, students are not exposed to volatile toxic substances, making these sessions longer and more comfortable.
Because of its great durability, the same piece can be used for several years without replacement, which minimizes the probability of environmental contamination. Lastly, the material produced, whether in color or in pale tones (without paint), arouses interest in students, which is reflected in the low failure rates (8 to 11%) observed in the practical examinations.
Currently we have just one technician for general activities in the sector, which also serves students of the animal science course, for a total of 600 students annually. Little skilled labor is required to produce the material. As student interest and curiosity about how to prepare the material grew, the Anatomy Sector decided to create an optional course, with a maximum of 10 undergraduates, to provide specific training (Figure 5). Two classes are planned each year and, year after year, they have expanded the anatomical collection available to subsequent students. This is a great opportunity for students in the final years of the program to work together and share experiences with beginners while reviewing anatomical content with an eye to its applicability in clinical and surgical practice.
Acknowledgment
We are grateful to the students of the Complementary Formation in Morphological Sciences course, who have collaborated in preparing the anatomical pieces. Thanks also go to an anonymous reviewer for patiently helping with the English language in this manuscript.
Creating Amazon Timestream interpolated views using Amazon Kinesis Data Analytics for Apache Flink
Many organizations have accelerated their adoption of stream-processing technologies in an effort to derive actionable insights from their data more quickly. Frequently, data from streams must be computed into metrics or aggregations and stored in near real time for analysis. These computed values should be generated and stored as quickly as possible; however, when late arriving data is in the stream, the values must be recomputed and the original records updated. To accommodate this scenario, Amazon Timestream now supports upsert operations, meaning records are inserted into a table if they don't already exist, or updated if they do. The default write behavior of Timestream follows first-writer-wins semantics, wherein data is stored as append-only and any duplicate records are rejected. However, some applications require last-writer-wins semantics or the update of existing records. This post is part of a series demonstrating a variety of techniques for collecting, aggregating, and streaming data into Timestream across a variety of use cases. In this post, we demonstrate how to use the new upsert capability in Timestream to deal with late arriving data. For our example use case, we ingest streaming data, perform aggregations on the data stream, write records to Timestream, and handle late arriving data by updating any existing partially computed aggregates. We also demonstrate how to use Amazon QuickSight for dashboarding and visualizations.

Solution overview

Time series is a common data format that describes how things change over time. Some of the most common sources of time series data are industrial machines and Internet of Things (IoT) devices, IT infrastructure stacks (such as hardware, software, and networking components), and applications that share their results over time. Timestream makes it easy to ingest, store, and analyze time series data at any scale.
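The version-based upsert behavior described above can be illustrated with a minimal, self-contained sketch. This is plain Python for illustration only; the dict-based store and key layout are assumptions, not Timestream's API:

```python
# Minimal illustration of Timestream-style upsert semantics:
# a record is inserted if its key is new, and overwritten only
# when the incoming record carries a higher version number.

def upsert(table, key, value, version):
    """Insert or update a record using last-writer-wins by version."""
    existing = table.get(key)
    if existing is None or version > existing["version"]:
        table[key] = {"value": value, "version": version}
        return True   # write accepted
    return False      # stale write rejected; a higher version is already stored

table = {}
upsert(table, ("host-1", "cpu_idle"), 90.0, version=1)
upsert(table, ("host-1", "cpu_idle"), 85.0, version=3)  # newer version wins
upsert(table, ("host-1", "cpu_idle"), 99.0, version=2)  # stale, rejected

print(table[("host-1", "cpu_idle")]["value"])  # 85.0
```

Setting the version to the current timestamp at write time, as the Flink connector later does, turns this into last-writer-wins on wall-clock time.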
One common application of time series data is IoT, where sensors may emit data points at a very high frequency. IoT sensors can often introduce noise into the data source, which may obscure trends. You may be pushing metrics at second or sub-second intervals but want to query data at a coarser granularity to smooth out the noise. This is often accomplished by creating aggregations, averages, or interpolations over a longer period of time to smooth out and minimize the impact of noisy blips in the data. This can present a challenge, because performing interpolations and aggregations at read time can increase the amount of computation necessary to complete a query. To solve this problem, we show you how to build a streaming data pipeline that generates aggregations when writing source data into Timestream. This enables more performant queries, because Timestream no longer has to calculate these values at runtime. The following diagram illustrates our solution's architecture. In this post, we use synthetic Amazon Elastic Compute Cloud (Amazon EC2) instance data generated by a Python script. The data is written from an EC2 instance to an Amazon Kinesis Data Streams stream. Next, a Flink application (running within Amazon Kinesis Data Analytics for Apache Flink) reads the records from the data stream and writes them to Timestream. Inside the Flink application is where the magic of the architecture really happens. We then query and analyze the Timestream data using QuickSight.

Prerequisites

This post assumes the following prerequisites:

QuickSight is set up within your account. If QuickSight is not yet set up, refer to Getting Started with Data Analysis in Amazon QuickSight for a comprehensive walkthrough.
You have an Amazon EC2 key pair. For information on creating an EC2 key pair, see Amazon EC2 key pairs and Linux instances.
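Before moving on to setup, the write-time aggregation idea above can be sketched in plain Python. The 5-minute window matches the Flink job described later; the bucketing helper and the (key, timestamp, value) tuple shape are illustrative assumptions, not the job's actual code:

```python
from collections import defaultdict

WINDOW_SECONDS = 300  # 5-minute tumbling windows, as in the Flink job below

def window_start(ts, size=WINDOW_SECONDS):
    """Floor a timestamp (in seconds) to the start of its tumbling window."""
    return ts - (ts % size)

def windowed_averages(points, size=WINDOW_SECONDS):
    """Average (key, timestamp, value) points per key per window."""
    sums = defaultdict(lambda: [0.0, 0])  # (key, window_start) -> [sum, count]
    for key, ts, value in points:
        acc = sums[(key, window_start(ts, size))]
        acc[0] += value
        acc[1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

points = [
    ("cpu_idle", 0,   90.0),
    ("cpu_idle", 120, 80.0),   # same 0-300s window
    ("cpu_idle", 310, 70.0),   # next window
]
print(windowed_averages(points))
# {('cpu_idle', 0): 85.0, ('cpu_idle', 300): 70.0}
```

Storing these precomputed averages alongside the raw points is what lets queries read the coarse series directly instead of averaging at query time.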
Setting up your resources

To get started with this architecture, complete the following steps. Note: this solution may incur costs; check the pricing pages for the services we're using.

Run the following AWS CloudFormation template in your AWS account:
For Stack name, enter a name for your stack.
For VPC, enter a VPC for your EC2 instance.
For Subnet, enter a subnet in which to launch your instance.
For KeyName, choose an EC2 key pair that may be used to connect to the instance.
Leave all other parameters at the default value and create the stack.

The template starts by setting up the necessary AWS Identity and Access Management (IAM) policies and roles to ensure a secure environment. It then sets up an EC2 instance and installs the necessary dependencies (Maven, Flink, and code resources) onto the instance. We also set up a Kinesis data stream and a Kinesis Data Analytics application. Later in this post, you build the Flink application and deploy it to the Kinesis Data Analytics application. The CloudFormation template also deploys the Timestream database and its corresponding Timestream table. QuickSight is not deployed via CloudFormation; we configure it manually in a later step. The entire process of creating the infrastructure takes approximately 5 minutes. After the CloudFormation template is deployed, we have an EC2 instance, a Kinesis data stream, an empty Kinesis Data Analytics application, a Timestream database, and a Timestream table. In the following steps, we go through setting up, compiling, and deploying the Flink application from the EC2 instance to Kinesis Data Analytics, starting the Kinesis Data Analytics application, streaming data to the Kinesis data stream, and reading the data from Timestream via interactive queries and QuickSight.
Throughout this guide, we refer to a few values contained in the Outputs section of the CloudFormation template, including how to connect to the instance and the Amazon Simple Storage Service (Amazon S3) bucket used.

Building the Flink application

Let's begin by connecting to the instance. On the Outputs tab of the CloudFormation template, choose the link associated with ConnectToInstance. This opens a browser-based terminal session to the instance. The EC2 instance has undergone some initial configuration: specifically, the CloudFormation template downloads and installs Java, Maven, and Flink, and clones the amazon-timestream-tools repository from GitHub. The GitHub repo is a set of tools to help you ingest data into and consume data from Timestream. We use the Flink connector included in this repository as the starting point for our application.

Navigate to the correct directory using the following commands:

sudo su - ec2-user
cd /home/ec2-user/amazon-timestream-tools/integrations/flink_connector_with_upserts

Examine the contents of the main class of the Flink job using the following command:

cat src/main/java/com/amazonaws/services/kinesisanalytics/StreamingJob.java

This code reads in the input parameters and sets up the environment. We set up the Kinesis consumer configuration, configure the Kinesis Data Streams source, and read records from the source into the input object, which contains non-aggregated records. To produce the aggregated output, the code filters the records to only those where the measure value type is numeric. Next, timestamps and watermarks are assigned to allow for Flink's event time processing. Event time processing allows us to operate on an event's actual timestamp, as opposed to system time or ingestion time. For more information about event time, see Tracking Events in Kinesis Data Analytics for Apache Flink Using the DataStream API. The keyBy operation then groups records by the appropriate fields, much like the SQL GROUP BY statement.
For this dataset, we aggregate by all dimensions (such as instance_name, process_name, and jdk_version) for a given measure name (such as disk_io_writes or cpu_idle). The aggregation window and the allowed lateness of data within our stream are then set before finally applying a custom averaging function. This code generates average values for each key, as defined in the keyBy function, which are written to a Timestream table at the end of every window (with a default window duration of 5 minutes). If data arrives after the window has already passed, the allowedLateness method allows us to keep that previous window in state for the duration of the window plus the duration of allowed lateness (for example, a 5-minute window plus 2 minutes of allowed lateness). When data arrives late but within the allowed lateness, the aggregate value is recomputed to include the newly added data. The previously computed aggregate stored within Timestream must then be updated.

To write data from both the non-aggregated and aggregated streams to Timestream, we use a TimestreamSink. To see how data is written and upserted within Timestream, examine the TimestreamSink class using the following command:

cat src/main/java/com/amazonaws/services/timestream/TimestreamSink.java

Towards the bottom of the file, you can find the invoke method (also provided below). Within this method, records from the data stream are buffered and written to Timestream in batches to optimize the cost of write operations. The primary thing of note is the use of the withVersion method when constructing the record to be written to Timestream. Timestream stores the record with the highest version. When you set the version to the current timestamp and include the version in the record definition, any existing version of a given record within Timestream is updated with the most recent data. For more information about inserting and upserting data within Timestream, see Write data (inserts and upserts).
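The recompute-and-upsert flow for late events can also be sketched outside Flink. The following is a toy model under stated assumptions (a single tumbling window size, and a monotonically increasing counter standing in for the current-timestamp version), not the connector's actual API:

```python
class LateAggregator:
    """Toy tumbling-window average with allowed lateness.

    Events that arrive after the window closed but within the allowed
    lateness trigger a recompute; each (re)emission carries a higher
    version so a downstream last-writer-wins store keeps the newest value.
    """

    def __init__(self, window=300, allowed_lateness=120):
        self.window = window
        self.allowed_lateness = allowed_lateness
        self.state = {}    # window_start -> [sum, count]
        self.version = 0   # stands in for the current timestamp

    def process(self, event_time, value, watermark):
        start = event_time - (event_time % self.window)
        # drop events later than window end + allowed lateness
        if watermark >= start + self.window + self.allowed_lateness:
            return None
        acc = self.state.setdefault(start, [0.0, 0])
        acc[0] += value
        acc[1] += 1
        self.version += 1
        return (acc[0] / acc[1], self.version)  # recomputed average, new version

agg = LateAggregator()
agg.process(10, 90.0, watermark=10)
agg.process(200, 70.0, watermark=200)
# A late event for the 0-300s window, arriving after the watermark passed 300
# but still within the 120s allowed lateness: the aggregate is recomputed.
late = agg.process(250, 50.0, watermark=320)
print(late)  # (70.0, 3)
```

Because the re-emitted aggregate carries a higher version, upserting it into a version-aware store overwrites the partial average computed when the window first closed.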
@Override
public void invoke(TimestreamPoint value, Context context) throws Exception {
    List<Dimension> dimensions = new ArrayList<>();
    for (Map.Entry<String, String> entry : value.getDimensions().entrySet()) {
        Dimension dim = new Dimension().withName(entry.getKey()).withValue(entry.getValue());
        dimensions.add(dim);
    }

    // set version to current time
    long version = System.currentTimeMillis();

    Record measure = new Record()
        .withDimensions(dimensions)
        .withMeasureName(value.getMeasureName())
        .withMeasureValueType(value.getMeasureValueType())
        .withMeasureValue(value.getMeasureValue())
        .withTimeUnit(value.getTimeUnit())
        .withTime(String.valueOf(value.getTime()))
        // by setting the version to the current time, the latest record
        // will overwrite any existing earlier records
        .withVersion(version);

    bufferedRecords.add(measure);

    if (shouldPublish()) {
        WriteRecordsRequest writeRecordsRequest = new WriteRecordsRequest()
            .withDatabaseName(this.db)
            .withTableName(this.table)
            .withRecords(bufferedRecords);
        try {
            WriteRecordsResult writeRecordsResult = this.writeClient.writeRecords(writeRecordsRequest);
            LOG.debug("writeRecords Status: " + writeRecordsResult.getSdkHttpMetadata().getHttpStatusCode());
            bufferedRecords.clear();
            emptyListTimetamp = System.currentTimeMillis();
        } catch (RejectedRecordsException e) {
            List<RejectedRecord> rejectedRecords = e.getRejectedRecords();
            LOG.warn("Rejected Records -> " + rejectedRecords.size());
            for (int i = rejectedRecords.size() - 1; i >= 0; i--) {
                LOG.warn("Discarding Malformed Record -> " + rejectedRecords.get(i).toString());
                LOG.warn("Rejected Record Reason -> " + rejectedRecords.get(i).getReason());
                bufferedRecords.remove(rejectedRecords.get(i).getRecordIndex());
            }
        } catch (Exception e) {
            LOG.error("Error: " + e);
        }
    }
}

Now that we have examined the code, we can proceed with compiling and packaging it for deployment to our Kinesis Data Analytics application.
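The buffer-then-flush pattern in the sink above can be reduced to a short, language-neutral sketch (plain Python, with a stubbed write function standing in for the Timestream client; batch_size is an assumed threshold, not a value taken from the connector):

```python
class BufferedSink:
    """Buffer records and flush them in batches to amortize per-write cost."""

    def __init__(self, write_fn, batch_size=100):
        self.write_fn = write_fn      # callable taking a list of records
        self.batch_size = batch_size
        self.buffer = []

    def invoke(self, record):
        """Buffer one record; flush automatically when the batch is full."""
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        """Write any buffered records as one batch, then clear the buffer."""
        if self.buffer:
            self.write_fn(list(self.buffer))
            self.buffer.clear()

batches = []
sink = BufferedSink(batches.append, batch_size=3)
for i in range(7):
    sink.invoke({"measure": "cpu_idle", "value": i})
sink.flush()  # flush the remainder, e.g. on checkpoint or close
print([len(b) for b in batches])  # [3, 3, 1]
```

The Java sink adds one more wrinkle this sketch omits: when the service rejects malformed records, they are removed from the buffer by index so the rest of the batch can be retried.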
Enter the following commands to create a JAR file to deploy to our Flink application:

mvn clean compile -Dflink.version=1.11.2
mvn package -Dflink.version=1.11.2

Deploying, configuring, and running the application

Now that the application is packaged, we can deploy it. Copy it to the S3 bucket created as part of the CloudFormation template (substitute <bucket> with the OutputBucketName from the CloudFormation stack outputs):

aws s3 cp /home/ec2-user/amazon-timestream-tools/integrations/flink_connector_with_upserts/target/timestreamsink-1.0-SNAPSHOT.jar s3://<bucket>/timestreamsink/timestreamsink-1.0-SNAPSHOT.jar

We can now configure the Kinesis Data Analytics application. On the Kinesis console, choose Analytics applications. You should see the Kinesis Data Analytics application created via the CloudFormation template. Choose the application and choose Configure. We now set the code location for the Flink application. For Amazon S3 bucket, choose the bucket created by the CloudFormation template. For Path to Amazon S3 object, enter timestreamsink/timestreamsink-1.0-SNAPSHOT.jar. Under Properties, expand the key-value pairs associated with FlinkApplicationProperties. The Kinesis data stream, Timestream database, and table parameters for your Flink application have been prepopulated by the CloudFormation template. Choose Update. With the code location set, we can now run the Kinesis Data Analytics application: choose Run, then Run without snapshot. The application takes a few moments to start, but once it has started, we can see the application graph on the console. Now that the application is running, we can start pushing data to the Kinesis data stream.

Generating and sending data

To generate synthetic data to populate the Timestream table, you utilize a script from Amazon Timestream Tools.
This Git repository contains sample applications, plugins, notebooks, data connectors, and adapters to help you get started with Timestream and enable you to use Timestream with other tools and services. The Timestream Tools repository has already been cloned to the EC2 instance. From your EC2 instance, navigate to the Timestream Tools Kinesis ingestor directory:

cd /home/ec2-user/amazon-timestream-tools/tools/kinesis_ingestor

The timestream_kinesis_data_gen.py script generates a continuous stream of EC2 instance data and sends this data as records to the Kinesis data stream. Note the percent-late and late-time parameters: these send 5% of all records 75 seconds late, leading some aggregate values to be recomputed and upserted within Timestream (substitute <stream-name> and <region> with your own values):

python3 timestream_kinesis_data_gen.py --percent-late 5 --late-time 75 --stream <stream-name> --region <region> > writer.log &

The Flink application starts to continuously read records from the Kinesis data stream and write them to your Timestream table. Non-aggregated records are written to Timestream immediately. Aggregated records are processed and written to Timestream every 5 minutes, with aggregates recomputed and upserted when late arriving data is generated by the Python script.

Querying and visualizing your data

We can now visualize the data coming in and explore it via the Timestream console. On the Timestream console, choose Query editor. For Database, choose your database. Run a quick query to verify we're pushing both real-time and aggregated metrics:

SELECT * FROM "InterpolatedBlogDB-"."EC2MetricTable-" WHERE measure_name IN ('cpu_idle', 'avg_cpu_idle') LIMIT 100

The following screenshot shows our output. In addition to simply querying the data, you can use QuickSight to analyze and publish data dashboards that contain your Timestream data. First, we need to ensure QuickSight has access to Timestream. Navigate to the QuickSight console.
Choose your user name on the application bar and ensure you're in the same Region as your Timestream database. Choose Manage QuickSight. Choose Security & permissions. Ensure that Timestream is listed under QuickSight access to AWS services. If Timestream is not listed, choose Add or remove to add Timestream. After you validate that QuickSight is configured to access Timestream, navigate to Datasets. Choose New dataset. Select Timestream. For Data source name, enter the Timestream database that was created as part of the CloudFormation template. Choose Create data source. For Database, choose the database created as part of the CloudFormation template. For Tables, select the table created as part of the CloudFormation template. Choose Select. In the following window, choose Visualize. When your data has loaded, choose Line chart under Visual types. Drag the time field to the X axis. For Aggregate, choose Minute. For Value, choose measure_value::double. For Aggregate, select Average. For Color, choose measure_name. Because our Timestream data is structured as a narrow table, we need to apply filters to the dataset to select the measures of interest. On the navigation pane, choose Filter. Choose Create one. Choose measure_name. Choose the newly created filter and select avg_cpu_system and cpu_system. Choose Apply. The filter is now reflected in the visualization.

Conclusion

In this post, we demonstrated how to generate aggregations of streaming data, write streaming data and aggregations to Timestream, and generate visualizations of time series data using QuickSight. Through the use of Kinesis Data Streams and Kinesis Data Analytics, we deployed a data pipeline that ingests source data, and consumes this data into a Flink application that continuously reads and processes the streaming data and writes the raw and aggregated data to Timestream.
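The measure_name filter is needed because Timestream's narrow layout stores one measure per row rather than one column per metric. As a small, tool-agnostic illustration (not Timestream client code; the row shape below just mirrors the measure_name/measure_value columns seen in the query), regrouping narrow rows into wide per-timestamp records looks like this:

```python
def pivot_narrow(rows):
    """Illustrative only: regroup narrow-format rows (one measure per row,
    as Timestream returns them) into wide records keyed by (time, host)."""
    wide = {}
    for r in rows:
        key = (r["time"], r["host"])
        wide.setdefault(key, {})[r["measure_name"]] = r["measure_value"]
    return wide

# Two narrow rows for the same instant and host...
rows = [
    {"time": "t0", "host": "i-1", "measure_name": "cpu_idle", "measure_value": 90.0},
    {"time": "t0", "host": "i-1", "measure_name": "cpu_system", "measure_value": 4.0},
]
# ...collapse into one wide record with a column per measure.
wide = pivot_narrow(rows)
```

QuickSight does the equivalent regrouping implicitly once measure_name is used for color and filtered to the measures of interest.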
We aggregated the data stream as it's written to Timestream to reduce the amount of computation required when querying the data. With QuickSight, we rapidly developed visualizations from data stored in Timestream. We invite you to try this solution for your own use cases and to read the following posts, which contain further information and resources:

What Is Amazon Kinesis Data Analytics for Apache Flink?
Build and run streaming applications with Apache Flink and Amazon Kinesis Data Analytics for Java Applications
Amazon Timestream Developer Guide: Amazon Kinesis Data Analytics for Apache Flink
Amazon Timestream Developer Guide: Amazon QuickSight
Amazon Timestream Tools and Samples
Amazon Kinesis Analytics Taxi Consumer

This post is part of a series describing techniques for collecting, aggregating, and streaming data to Timestream across a variety of use cases. This post focuses on challenges associated with late-arriving data, partial aggregates, and upserts into Timestream. Stay tuned for the next post in the series, where we focus on Grafana integration and a Kotlin-based connector for Timestream.

About the Authors

Will Taff is a Data Lab Solutions Architect at AWS based in Anchorage, Alaska. Will helps customers design and build data, analytics, and machine learning solutions to meet their business needs. As part of the AWS Data Lab, he works with customers to rapidly develop and deploy scalable prototypes through joint engineering engagements.

John Gray is a Data Lab Solutions Architect at AWS based out of Seattle. In this role, John works with customers on their Analytics, Database and Machine Learning use cases, architects a solution to solve their business problems and helps them build a scalable prototype.

https://aws.amazon.com/blogs/database/creating-amazon-timestream-interpolated-views-using-amazon-kinesis-data-analytics-for-apache-flink/
right/wrong/neither
I recommend listening to this song while reading; it helps me focus and it might help you too. :)
When I began our class in public art in sound and listening, my way of thinking was very much rooted in discernible outcomes and notions of success. This was largely a product of the environments I had grown up in: intense, competitive academic spaces, playing sports, going to a well-regarded college. Even in my first year at Brown, the notion of comparative success was pushed forth; I was denied entry to classes due to a comparatively worse portfolio, writing sample, or application. Not only were opportunities to learn limited; once in class, creative assignments I submitted were deemed poor in quality because they were not up to par with the level of the rest of the class or did not meet the expectations of a rigid rubric imposed by the professor. I questioned why the system existed in a way that promoted uniformity and rewarded following rigid instructions over organic growth and learning.
Even at a place like Brown University where a liberal education is championed, I felt limited in my ability to make choices for myself, questioning my every decision and my place on campus. Why did every decision I made feel "wrong"? Why did I constantly feel like I was in the "wrong" place, doing the "wrong" things? It was around this time of self-doubt that we read Miwon Kwon's "The Wrong Place" and Judith Halberstam's The Queer Art of Failure. For a long time, it had been explained to me that the greatest growth and discovery was made when I failed, when things didn't work out, but I was still resistant. Halberstam's writing expressed a similar sentiment in a way that spoke to me greatly. As Halberstam explains in the introduction, "Failure preserves some of the wondrous anarchy of childhood and disturbs the supposedly clean boundaries between adults and children, winners and losers," and later points out that "[failure] provides the opportunity to use these negative affects to poke holes in the toxic positivity of contemporary life" (Halberstam 3). Halberstam's argument recognizes the importance of positivity but also the ability of negativity to shift our perspective and view things through a different or critical scope.
In a similar vein, Kwon's writing recognizes that objective rights and wrongs are nonexistent; rather, our relationship to objects, beings, and places is what defines our sense of right and wrong: "The determination of right and wrong is never derived from an innate quality of the object in question, even if some moral absolutes might seem to preside over the object. Rather, right and wrong are qualities that an object has in relation to something outside itself... The more important point here is that it is we who are wrong for this kind of 'new' space" (Kwon 38-39). Kwon explains that ending up in the "wrong" place can often lead us to new discoveries about ourselves that we would miss if we followed rigid, "correct" paths. I really love one of her closing statements in the piece:
"Often we are comforted by the thought that a place is ours, that we belong to it, perhaps even come from it, and therefore are tied to it in some fundamental way. Such places ('right' places) are thought to reaffirm our sense of self, reflecting back to us an unthreatening picture of a grounded identity. This kind of continuous relationship between a place and a person is what is deemed lost, and needed, in contemporary society. In contrast, the wrong place is generally thought of as a place where one feels one does not belong: unfamiliar, disorienting, destabilizing, even threatening. This kind of stressful relationship to a place is, in turn, thought to be detrimental to a subject's capacity to constitute a coherent sense of self and the world" (Kwon 42).
Kwon and Halberstam's discussions of failure and place bring me to one of the first posts I made on our class soundblog, a podcast profiling the artist Emily A. Sprague, a founding member of the band Florist and an independent artist as well, working primarily in ambient music and creating with Eurorack modular synthesizers. Hailing from a rural community in the Catskill Mountains, Sprague explains how space has shaped her processes of creation: "Every studio I've ever had has been in the place that I've been living in... You learn from that, being in spaces that aren't 'Studio Bs'... You just learn to work with what you have" (Sound + Process). On her origins, Sprague later explains, "Community has always been something that I've known to be incredibly hard to find and also the best and most rewarding and inspiring thing that you can experience. I'm from a small town in a pretty rural area; I didn't really find people until I was older that I really felt a part of a community with, with making music" (Sound + Process). Echoing Halberstam's argument, Sprague has repeatedly tried and experimented with space and technique, creating new ways to approach modular synth and pushing the boundaries of genre. And as Kwon explains, Sprague has made new discoveries in her process of making through the spaces she's in: not that a place is right or wrong, but that each is different and produces a different result.
With her process of making rooted in modular synthesis, it is hard to deny Sprague's precedents. On June 7th, 2017, Sprague made an Instagram post of a single book on a hardwood table: Daphne Oram's An Individual Note of Music, Sound and Electronics. Daphne Oram (1925-2003) was one of the central figures in the development of British experimental electronic music (Anomie Publishing). Oram declined a place at the Royal College of Music to become a music balancer at the BBC, and she went on to become the co-founder and first director of the BBC Radiophonic Workshop (Anomie). Leaving the BBC in 1959 to pursue commercial work in television, advertising, film and theater, Oram also made her own music for recording and performance, continuing her personal research into sound technology, a passion she had cultivated since her childhood in rural Wiltshire (Anomie). Eventually her home in Kent became an unorthodox studio and workshop, which she crafted on a minimal budget and where she developed her pioneering equipment, sounds, and ideas (Anomie). A significant part of her personal research was the invention of a machine that offered a new form of sound synthesis: the Oramics machine (Anomie). Her biography further cements her influence on contemporary electronic artists, with Oram's contribution to electronic music receiving considerable attention from new generations of composers, sound engineers, musicians, musicologists and music lovers around the world (Anomie).
Like Wendy Carlos, Oram was a pioneer of synthesizer music and technology, definitively changing the ways her contemporaries approached synthesis, as well as generations to come. It seems as though Carlos, Oram, and Sprague are inextricably linked. As Carlos focuses intently on her studio on her website, her primary form of external communication, it is evident that the artist considers her studio a point of pride and importance (Wendy Carlos website). If Wendy Carlos's studio is Persian rugs, felines, and the crackle of a fireplace on a frigid winter day, and Oram's is a quiet converted oasthouse, then Sprague's studio is a surfboard leaned against a corner next to a human-sized floor plant as sun pours in through a skylight on a warm California morning (Kheshti). Like Kheshti's relationship with Carlos, I feel connected to Sprague in a similar way. I do not mean to equate our relationships or interpolate myself into the discussion of electronic musicians, but I do find great joy in listening to electronic music and feel that it is an important part of my life, similar to the way Kheshti describes.
There is something extremely childlike, imaginative, and fantastical about home studios. They are places for experimentation and imagination, mostly unbounded by judgement or criticism, creating a place to take risks and make new discoveries. In many ways a home studio allows for a democratic education of sorts, a place where a creator can speak their own language and have internal dialogue, unrestricted by rigid constraints that may be imposed externally otherwise, and even explore the inherent fun in learning (hooks 43-44).
The ability of these artists to create in unexpected places and to push the boundaries of their genre and craft reminds me of Fluxus artists like Yoko Ono or Alison Knowles. There is an ambiguity in the place and correctness of a Fluxus score. They are not defined by doing things in a certain way or a certain place or for a certain outcome, but by doing for the sake of doing, trying, experimenting, learning, and moving forward. I recently watched a film that referenced Yoko Ono's "Ceiling Painting, Yes Painting" (1966), where the person interacting climbs a ladder to a magnifying glass in order to discern a tiny speck on the ceiling that reads "YES" (Guggenheim Bilbao). I think this piece is beautifully poetic in a number of ways, but specifically for its affirmation in discovery, and for doing so in a playful, almost childlike and imaginative manner. On this note, I want to include some scores I wrote throughout the course of the semester for consideration, reflection, and response (dots indicate separate scores):
sit on a bench and be the last to break eye contact with a stranger • collect fallen leaves from the ground into a paper bag and deliver to someone • learn the language of a Tree and have a conversation • ask a loved one (or a complete stranger) to name a favorite song and listen to it in full • listen to your breath as you run up a steep hill and walk down slowly; listen to your breath as you walk up a steep hill and run down slowly • cut holes in an umbrella during a rainstorm and listen to the irony pour through • get a bicycle and ride across America • hold your palms and fingers gently over the tips of grass at dawn and wipe the dew across your cheeks • do nothing • sitting cross-legged on the floor, recount in detail to an audience (of any or no size) the most recent dream that you can remember • make a friend • look at the Atlantic Ocean; turn 180 degrees; walk; look at the Pacific Ocean • grab a cactus / smash a guitar • move fast so that wind becomes music
Through all these artists, authors, and activists, like Ono, Knowles, Carlos, Oram, Halberstam, Kwon, hooks, Kheshti, and beyond, it is clear that approaching things not with notions of right or wrong, but with the intention of discovery, experimentation, and playful imagination is a valuable way of living. In the inscription to hooks's Teaching Community, the author quotes Paulo Freire: "It is imperative that we maintain hope even when the harshness of reality may suggest the opposite." In many ways, these figures stand for just that: a rejection of the harshness of reality through creativity, experimentation, discovery, and a love for learning.
Bibliography
"Ceiling Painting, Yes Painting (1966)." Guggenheim Bilbao, http://yokoono.guggenheim-bilbao.eus/en/artworks/ceiling-painting-yes-painting.html.
"Daphne Oram: An Individual Note of Music, Sound and Electronics." Anomie Publishing, Anomie Publishing and The Daphne Oram Trust.
"Emily Sprague: Sound + Process #8." SoundCloud, 2017, https://soundcloud.com/sound-and-process/es_ep8.
Halberstam, Judith. The Queer Art of Failure. Duke University Press, 2011.
hooks, bell. Teaching Community. Routledge, 2003.
Kheshti, Roshanak. Switched-on Bach. Bloomsbury Academic, 2019.
Kwon, Miwon. "The Wrong Place." Art Journal, vol. 59, no. 1, 2000, pp. 33-43. JSTOR, doi:10.2307/778080.
"Wendy Carlos." Wendy Carlos, http://www.wendycarlos.com/.
300+ TOP AUTOCAD Objective Questions and Answers
AUTOCAD Multiple Choice Questions :-
1. You have just issued the HIDE command on a new 3D House Plan to see how the design looks, but find that all of the interior room labels are showing through your opaque walls, ruining the 3D effect. What can you do with the TEXT objects to fix this? A. Assign a thickness of 0.001" to all TEXT. B. Place the TEXT on the Defpoints layer. C. Place the TEXT on a no plot layer. D. Assign the Hide property to all TEXT. E. Assign a thickness of 12" to all TEXT. Answer: A
2. Which one of the following AutoCAD objects can NOT have a 3D (Z) thickness property applied to it? A. TEXT B. MTEXT C. LINE D. CIRCLE E. PLINE Answer: B
3. You want to draw an octagon shape window on a vertical wall face in your 3D House Plan. Which UCS command option is best suited to place your User Coordinate System on this front wall face to ensure the window is placed flat on this wall? A. Origin B. ZAxis C. 3point D. View E. X Answer: C
4. Which AutoCAD command will determine precisely the volume of a complex (or simple) 3D Solid part? A. AREA B. MASSPROP C. VOLUME D. PART E. CALCULATE Answer: B
5. The purpose of the UCSICON command is to: A. Permit a user to move the UCS system. B. Permit a user to redefine the location of 0,0. C. Permit a user to control the placement and appearance of the icon to help you understand your orientation in the XYZ coordinate system. D. Provide a working plane to draw on. E. Change to the desired working units from decimal to feet and inches. Answer: C
6. The primary purpose of the UCS (User Coordinate System) is? A. Helps a user to calculate the area of an object. B. To determine angles for isometric projections. C. Allows a user to select the desired type of drawing units. D. Allows a user to draw on a specified 2D Plane in 3D space. E. Allows a user to define a place to put icons. Answer: D
7. To determine if two parts (3D Solids) will fit together without any interference, what 3D AutoCAD command would you use? A. FIT B. UNION C. SUBTRACT D. INTERSECT E. INTERFERE Answer: E
8. What is the volume of a cone with a 12' radius and a 12' height? A. 3,216,915.13 cu in B. 3,162,915.13 cu in C. 3,126,195.13 cu in D. 3,126,951.13 cu in E. 3,126,915.13 cu in Answer: E
9. You apply a property called Thickness to a standard AutoCAD CIRCLE. What type of AutoCAD object do you now have officially? A. Surface Model (Polygon Mesh) B. Solid Model (3D Solid) C. PLINE D. Cylinder E. Region Answer: A
10. Which of these objects will only draw flat on the current XY working plane and not allow 3D coordinate input (e.g. Z input)? (Check ALL that apply) A. LINE B. DONUT C. CIRCLE D. PLINE E. 3DPOLY Answer: B, C, D

AUTOCAD MCQs

11. You have just created an interesting new round object by using REVSURF to revolve a complex curve around an axis. However you find there are only 6 faces around the perimeter, leaving it very rough looking. What variable do you change to correct this? A. Set SURFTAB2 to 18 B. Set TABSURF1 to 18 C. Set SPLINESEGS to 12 D. Set TABSURF2 to 18 E. Set SURFTAB1 to 18 Answer: E
12. The axis used to show depth in AutoCAD® is the A. W axis. B. X axis. C. Y axis. D. Z axis. Answer: D
13. The right-hand rule is used to determine the direction of the A. positive X axis. B. negative Y axis. C. positive Z axis. D. negative X axis. Answer: C
14. All of the axes in the 3D coordinate system meet at A) 60° angles. B) 90° angles. C) 120° angles. D) 135° angles. Answer: B
15. In AutoCAD®, all objects are drawn on the A) YZ plane. B) XZ plane. C) XY plane. D) ZX plane. Answer: C
16. The UCS command A) allows you to manipulate the user coordinate system. B) controls display of the UCS icon. C) defines the world coordinate system. D) allows you to draw in the YZ plane. Answer: A
17. The World option of the UCS command A) returns to the world coordinate system. B) returns to the previous user coordinate system. C) defines a new world coordinate system. D) aligns a new coordinate system parallel to the screen. Answer: A
18. To allow you to view 3D models from different points of view, AutoCAD® provides A) three predefined 3D views. B) six predefined 3D views. C) ten predefined 3D views. D) twelve predefined 3D views. Answer: C
19. The VIEW command does all of the following except A) provide access to AutoCAD®'s predefined views. B) provide access to custom views. C) create a new custom view. D) display the View toolbar. Answer: D
20. To view a model from any position in 3D space, use the A) VIEW command. B) 3DZOOM command. C) 3DORBIT command. D) PAN command. Answer: C
21. The View option of the UCS command A) aligns a new UCS parallel to the screen. B) aligns a new UCS to the selected face of an object. C) moves the origin of the current UCS to the lower left corner of the screen. D) applies the current UCS to a specified viewport. Answer: A
22. A wireframe model of a titanium block would most closely resemble a(n) A) ice cube. B) empty cardboard box. C) box-shaped wire basket. D) book. Answer: C
23. Wireframe drawings can be used to A) show and define the sides of an object. B) present design ideas to potential clients. C) calculate volume. D) determine spatial relationships between objects. Answer: D
24. If you were on one side of a wireframe cube looking toward a solid sphere on the other side of the cube, you would see A) only the wireframe. B) the solid sphere and the wireframe, because the wireframe has no sides. C) the outline of the sphere through the translucent walls of the wireframe. D) the solid sphere only, because the wireframe is transparent. Answer: B
25. Of the following, the object best suited for a wireframe model is A) the hull of a speedboat. B) a refrigerator. C) a wire shelving unit. D) a laptop computer. Answer: C

AUTOCAD Objective type Questions with Answers

26. To edit a wireframe model, use A) grips. B) the WIREDIT command. C) the SPLINEDIT command. D) the WEDIT command. Answer: A
27. A wireframe model is created using A) 2D objects with the EXTRUDE command. B) 2D objects such as points, lines, and circles. C) 3D objects such as spheres and cones. D) 3D commands such as REVOLVE and TABSURF. Answer: B
28. One disadvantage of using wireframes as opposed to surface or solid models is that A) they take longer to create. B) they take up more space on the storage medium. C) they corrupt more easily. D) they can be difficult to understand because they are "stick figures." Answer: D
29. The main difference between creating wireframe models and creating 2D objects is that A) 2D objects cannot be extruded. B) wireframe objects cannot be edited using 2D editing commands. C) 2D objects are created on two or more planes. D) wireframe geometry is created using the X, Y, and Z axes. Answer: D
30. To avoid surprises when you create a wireframe model, it is a good idea to A) avoid using object snaps. B) stay in the default world coordinate system. C) specify coordinates using the absolute technique. D) view the model from the top (plan) view while working. Answer: C
31. Compared with surface and solid models, wireframe models take A) about the same amount of time to create. B) less time to create. C) more time to create. D) about 30 minutes to create. Answer: A
32. Surface models are considered to be more advanced than wireframe models because A) they can be used to calculate mass properties. B) they have surfaces (faces). C) they have volume. D) they can be used to calculate holding properties and weights. Answer: B
33. An analytic surface is one that A) is described by an equation or set of equations. B) is formed from a set of data points that are used as control points. C) interpolates the input data, allowing control of individual data points. D) is created using a minimum of four curves that have adjoining edges. Answer: A
34. Surface models are used mostly in A) modeling mechanical parts. B) creating architectural models. C) modeling complex electrical systems. D) modeling complex curves in boat hulls and car fenders. Answer: D
35. The number of segments in a tabulated surface is controlled by the A) SURFTAB1 system variable. B) SURFTAB2 system variable. C) SURFTYPE system variable. D) EDGESURF command. Answer: A
36. A fillet surface is used to A) reshape a bicubic Coons patch with smooth curves. B) round the edges of a planar surface. C) blend two surfaces together. D) convert facets into smooth curves. Answer: C
37. A surface primitive is a(n) A) basic 3D object that can be used to create more advanced models. B) wireframe model from which surface models can be created. C) 2D object that can be extruded to form a surface model. D) analytic surface from which nonanalytic surfaces can be created. Answer: A
38. All of the following are surface primitives except the A) box. B) cone. C) spiral. D) dish. Answer: C
39. To create an approximation of a Coons patch in AutoCAD®, use the A) 3DMESH command. B) REVSURF command. C) TABSURF command. D) EDGESURF command. Answer: D
40. The Spline option of the PEDIT command A) creates an arc-fit polyline joining each pair of vertices to convert a straight-line polyline into a curve. B) uses the vertices of the polyline as control points to create a curve approximating a B-spline. C) generates the linetype in a continuous pattern through the vertices of the polyline. D) smoothes a polygon mesh surface. Answer: B
41. To control the type of surface created using the Smooth surface option of the PEDIT command, use the A) SURFTAB1 system variable. B) SURFTAB2 system variable. C) SURFTYPE system variable. D) EDGESURF command. Answer: C
42. Solid models are used more frequently than surface models or wireframes because they A) result in smaller files. B) are more accurate. C) can be viewed from any point in 3D space. D) allow you to calculate mass properties. Answer: D
43. The disadvantage of solids created using pure primitive instancing is that A) they are less accurate than solids created using cell decomposition. B) they must be converted to another type of solid before they can be edited. C) they are created using the B-rep format. D) some of the parameters of the solids are not specified. Answer: B
44. By default, AutoCAD® displays all kinds of models in wireframe format. To find out which type of solid a specific model is, use the A) ID command. B) REGEN command. C) LIST command. D) MODELTYPE system variable. Answer: C
45. Spatial occupancy enumeration is rarely used in CAD applications because it A) requires a large amount of memory. B) is not sufficiently accurate for CAD work. C) requires an obscure operating system. D) does not allow 3D solids to be edited. Answer: A
46. To store and display modeling information, AutoCAD® uses A) cell decomposition. B) boundary representations and sweeping. C) constructive solid geometry. D) pure primitive instancing. Answer: B
47. All of the following are examples of solid primitives except the A) sphere. B) cone. C) dome. D) pyramid. Answer: C
48. The ISOLINES system variable specifies A) the number of segments in a tabulated model. B) the number of contour lines that define a solid model. C) the direction of contour lines in a solid model. D) the length of the segments in an extrusion. Answer: B
49. All of the following objects can be extruded except a(n) A) arc. B) polyline. C) ellipse. D) region. Answer: A
50. To create a revolved solid, use the A) EDGESURF command. B) REVSURF command. C) REVOLVE command. D) EXTRUDE command. Answer: C
Pokémon Moon, Epilogue: Responsibility
Beneath the crystal dome at the summit of Mount Lanakila, all is serene; all is peaceful. The dome sparkles in the midday sun, the air is still, pure white clouds drift softly past the mountain below, and the inlaid Pokéball design on my throne pulses gently with a warm azure light.
"…sweet Arceus, I'm SO BORED!"

Around one o'clock in the afternoon, my scheduled challenger shows up.
"Finally," I complain. "What took you so long?"
Hau rubs the back of his neck in embarrassment. "Eheh… Gramps pulled some new move we hadn't seen before. Almost didn't get past him this time. Raichu came through for me, though." The electric mouse Pokémon hovering at his side crackles at him affectionately, and he pats it on the head.
"And what did you learn?" I ask expectantly.
Hau sighs. "Even Kahunas are always learning and perfecting new techniques," he replies in the sing-song voice of a bored child reciting the moral of a story.
"Good. Now… shall we?"
He grins. "I've made it this far and I'm not done yet! After all, there's still a trainer standing right in front of me that I've got to overcome!"
I grin back. "That's the spirit! Now hurry up and get me out of this mess!"

I suppose I didn't really expect him to win this time. Hau's progress has been impressive, I have to admit, but I still don't think he understands the gravity of what he's asking for. As if to symbolise the new level he achieved in defeating his grandfather, he's added a Crabominable (Hala's signature Pokémon) to his existing team of Raichu, Flareon, Komala and Primarina. There is a nagging inelegance in his style, his Pokémon have not perfected all of their techniques, and there is still the barest hint of hesitation between his thoughts and their actions. I could lecture him about all this, but the fact is, he's already on the path; he and his Pokémon are going to keep advancing regardless of what I do. Every time we battle, Hau comes a little closer. For now, though, he's still a long way off, and his Primarina's valiant last stand against my Decidueye ends in defeat.
"Seriously!?" Hau blurts after we have recalled our Pokémon. "This is really frustrating, you know? I seriously tried my hardest!"
I smile at him. "I should hope so; I don't want to be Champion of this backwater one day longer than I have to! You are getting better; I promise I'm not just saying that."
"Yeah, yeah, I know." He returns my smile. "And at least I had a blast going all out against you."
"Likewise! Now…" I wander to the edge of the arena platform and sit down, legs dangling over the edge, then pat the ground next to me. "Come on; time for this week's lesson."
Hau obediently sits down alongside me. A thought strikes him. "Say, d'you do this after all the challenges you get?"
"No. Just with you." He looks at me in surprise. "Sooner or later, Hau… maybe sooner than either of us realises… that's going to be your chair. I want you to be ready. I owe you that."
Hau shakes his head. "This is so heavy, man…"
"Being the Champion is."
"Sooo… what are this week's words of wisdom?"
I stare out at the crystal dome in silence for a while. "We've tangled with some more Ultra Beasts since the last time I saw you."
"With that… international police guy, right? Look-?"
"Don't say it!" I cut him off. "Don't you dare say that smug prick's ridiculous code name; I refuse to dignify it by using it. Also, I think he can hear whenever anyone in the world says it." I glance shiftily from side to side to stress my point.
"Uh…" Hau hesitates, confused. "What am I s'posed to call him?"
"…I have a few suggestions, but I think I'd get in trouble with Hala if you ever repeated any of them. Mr. L will do, I suppose." I frown thoughtfully. "Watch out for that one. He likes to treat Champions as his own unpaid interns."
"I'll… remember that."
"See that you do. Anyway, yes. We went on several more missions for him and his superior, hunting more of the Beasts that appeared when Lusamine opened that wormhole at the Aether Paradise." Hau shudders.
"Man, those things give me the creeps."
"That makes two of us."
"It wasn't more of those scary floating mind-controlling ones, was it? Just seeing one of them was enough for a lifetime!"
"No… no, different species. A humanoid insect faster than the eye… a bundle of raw nerve cords full of electricity… some kind of biomechanical rocket… and a huge, devouring mouth, consuming everything around it." Hau cocks his head, trying to picture them and clearly failing. "I… could bring one for you to meet, but I'm not entirely certain that would be safe."
"Oh, no no no, that's fine!" he replies hurriedly, then realises the significance of what I just said. "Wait… you could bring one?"
"That's right. My 'assignment' was to capture all of the rogue Ultra Beasts in order to prevent them from causing any harm to Alola's fragile environment. So that's what I did."
"Capture!? Well- well, you're gonna put them back, right? Back… y'know. Where they came from?"
"That… may not be so straightforward. The technology Lusamine used to open her Ultra Wormhole is useless without Nebby, and he seems… reluctant to open another one into Ultra Space. I'm not sure I can return them to their own dimension."
"Wh- what are you gonna do with them, then?"
"Funnily enough, Interpol had no specific plan for that stage of the operation. What's more, all the accounts of their previous responses to situations like this are supposedly classified… although, frankly, I doubt they would be much help." A thought occurs to me. "What would you do, Hau?" I ask pointedly.
"Me? Why's that matter?"
"Because the next time this happens, it might be on your watch." His eyes widen in panic. "Take your time. Try to cover all the angles. What choices can you think of?"
"Um…" Hau hesitates. "Well, we can't let them go, can we? That just puts us back where we started, with a bunch of Ultra Beasts running wild in Alola."
"Mmm. Quite."
"But…" he continues, thinking out loud. "Maybe… somewhere else?
Couldn't there be some other region really far away… somewhere they could live in peace?"

"Interesting idea. I did ask Wicke about that – with Lusamine more or less out of commission, she's the closest thing any of us have to an expert on the Ultra Beasts' behaviour and physiology. She's looking into it, but she's not optimistic. Their metabolisms are wildly inefficient under typical Earth conditions, so they have to eat ravenously just to keep their powers from fading, which wreaks havoc on the local food chains, and the unfamiliar environment provokes a near-constant fight-or-flight response that places them under incredible metabolic strain…"

"…huh?" Hau asks in befuddlement. I remember I'm not supposed to be teaching him Pokémon ecology.

"Uh… the point is… we haven't found any good options. Not on Earth, anyway – and at the moment, Earth is the only available planet." I shrug.

"Could you build a habitat? The Aether Foundation had that awesome indoor Pokémon preserve, yeah? They could build another one for the Ultra Beasts, to be like their home!"

"Mayyyyybe…" I say hesitantly. "At the moment, Gladion is the acting President of the Aether Foundation, and his partner Pokémon was born to be a 'Beast-Killer'… so… I'm not so sure how enthusiastic he'll be about devoting most of the Foundation's remaining manpower and resources to building a reservation for the Ultra Beasts."

"Aue, you're right! I wouldn't wanna have that talk!"

"I mean, I'm definitely going to talk to him about it!" I add quickly. "It's just… going to take some persuasive nuance. That, and there's still so much we don't know about Ultra Space. It'll take time to gather data – but we have Professor Burnet, and we have Wicke and the Aether Foundation. In the meantime, though…" I trail off and shrug again.

"I… I guess you could put them in stasis, right? Like all those Pokémon Ms. Lusamine had in her lab…?" He shivers at the thought.
"You couldn't leave them in there permanently… I don't even like the thought of doing it at all. But at least it would give you more time to find a new place for them to live, or a way to send them home." I nod.

"Always be suspicious of 'temporary' solutions to hard problems. They have a tendency to become permanent when no-one's looking. But yes, that's a decent idea." Hau nods, and thinks in silence for another minute.

"You said you… captured them, yeah? Like Pokémon, in Pokéballs?" he asks.

"That's right. Using the Beast Balls that Lusamine developed to catch Nihilego."

"So the Ultra Beasts are Pokémon then?" I pause.

"Hmm. What do you think?"

"Uh… well… they sure aren't human, right? So… they must be?" He thinks it's a trick question. Can't blame him; I ask Hau a lot of trick questions.

"Not every living thing that isn't human is a Pokémon," I remind him. "Think of the plants, think of bacteria… the Pokérus, whatever that is…" I reach into my backpack, pull out an empty Pokéball, and turn it over in my hands pensively. "According to Wicke, ordinary Pokéballs have trouble recognising the Ultra Beasts as Pokémon. Their physiology is too different. That's why Lusamine ordered the Foundation to create the Beast Balls."

"But you can catch them, right? And they use Pokémon moves, and they have Pokémon types."

"Yeah, but should we define what is or isn't a Pokémon based on the relationship it has with humans, and how it fights our battles? That's kinda self-centred of us, isn't it?" Hau hesitates, but then shakes his head decisively.

"Pokémon training isn't just about what Pokémon do for humans! It's about having fun, and being friends, and eating malasadas together on the beach! You've gotta reach out to them, give 'em a chance to make friends! Maybe then this world won't be so scary, and they can live here and be happy!" I chuckle.

"Ever the ray of sunshine, aren't you?" I say drily.
"Friendship alone can't overcome the hardships of living in an alien dimension, Hau. But I suppose it could make it easier. And much as I hate to admit it, if anyone's going to try it, I probably have the best chance."

"What, because you're such an epic trainer?" Hau asks, a hint of scepticism in his voice.

"Heh. I wish. No, it's… something else. Because of our… unique experiences… Lillie, B, Guzma, Lusamine and I are something that Interpol calls 'Fallers' – people who've spent time in Ultra Space and lived to tell the tale. We have what you might call the 'scent' of the Ultra Wormholes on us, something that the Beasts can sense. They think we're a way home." Hau looks baffled at first, but then I see a lightbulb go on in his head.

"If you remind them of home, you can keep them calm! You can help them adjust to living on Earth!"

"Hypothetically, yes. It's worth a try, anyway. I haven't actually attempted to communicate with any of the Ultra Beasts in my… care… just yet."

"What's the worst that could happen?" Hau asks, with the cavalier attitude of someone who has absolutely no idea what the worst that could happen is.

"I'm not sure," I admit. "But… there was the matter of the Interpol task force's leader. The Hoennese woman." The change in subject makes his eyebrows go up in confusion.

"Oh, right – you said you thought you knew her from somewhere? Did you find out anything else?" I shrug.

"Well, it was her, all right. Anabel, Salon Maiden of the Battle Tower, at one point the third most powerful trainer in Hoenn, and rumoured to be a gifted psychic. Only she disappeared years ago, and as far as I know she's still listed in her home region as a missing person." Hau wrinkles his nose.

"Did she run away to join the International Police? Fake her death or something?"

"Not a bad guess, but no. The truth is… stranger. Apparently, she is also a Faller.
She disappeared when she ran afoul of an Ultra Wormhole, and was found by Interpol agents on Poni Island, suffering from amnesia. They figured out what had happened to her when they detected the radiation signature of the Wormholes on her body… but they never told her that. She has no idea what happened to her."

"That doesn't sound right at all! How can they keep that from her!?"

"Well…" I begin hesitantly. "I'd like to say it's because her own Interpol superiors are concerned about bringing back her traumatic memories, or something… but to be honest, I think they just consider her more useful as an operative specialising in Ultra Beasts if she doesn't know about her status as a Faller. Easier to use her as bait." Noticing a look of horror on Hau's face, I explain further. "These are not nice people, Hau. They're on our side, and they're miraculously competent, with… a certain notable exception. But they are every bit as ruthless as Lusamine's Aether Foundation, and they will use you if you let them."

"Aue… this is making my head spin," he complains. "And you still haven't explained what Anabel has to do with you trying to train the Ultra Beasts."

"Oh! Right. Yes. Well… The way the Ultra Beasts react to Anabel, as a Faller, makes me… more than a little nervous," I admit. "They were extremely aggressive towards her – as if they thought she was guarding or blocking their way home." Hau's face falls.

"So maybe they'll blame you for them being stuck in Alola, eh?"

"And react accordingly, yes. And there's something else, a sort of… mental fatigue that Anabel seems to suffer from being around them. It could just be to do with the length of time she spent in Ultra Space, or even a result of her natural psychic potential, but considering the way Nihilego was able to influence Lusamine…" I realise that I'm speculating more wildly than usual, and shake my head clear.
"I don't know, but like I said, the idea makes me… nervous about talking to them."

"So trying to train them like Pokémon isn't a sure thing either…" he decides, summing up the last minute or so of conversation. I let the silence linger for a while, hesitating to bring up my last point.

"There's another possibility you're not suggesting." Hau just stares in confusion.

"I give up, cousin; what's the other option?" I look away, close my eyes, and sigh.

"Destroy them." He stifles a gasp.

"What!? No! We can't!"

"And why is that?"

"They're still living things, right? And they're far away from home and prolly as scared of us as we are of them! It's not fair!"

"All true. But they're also dangerous, powerful entities that don't belong in this world, have already done noticeable damage to every Alolan ecosystem they've touched, and could kill Arceus knows how many Pokémon and humans if they're left to roam. What if we can't contain them? What if we can't teach them to live here in peace?"

"Maybe we can't, but we have to try!" he protests.

"And if we're wrong, or if we fail? Who bears responsibility for what the Ultra Beasts do next?"

"I- I don't know, but-! But it's not their fault either! Lusamine brought them here; they didn't ask for that!"

"And the people and Pokémon of Alola didn't ask to be put in the path of a confused and angry extradimensional monster with the powers of a god! Are you ready to take responsibility for that happening?"

"I- no, but- but we can't kill them just for being lost in our world!"

"And would you really stand by that, if your home were in their path!?"

"I- I- I…" He's visibly upset; his eyes are starting to water a little, and he turns away and wipes his sleeve over his face. I try to soften my expression, and gently rest a hand on his shoulder. Hau blinks and looks back at me.

"It's that sort of thing, Hau," I tell him softly, "that will make you a better Champion than me." He blinks, puzzled. "I considered it.
Interpol considered it. You wouldn't."

"You wouldn't… like… really do it though, would you?" Hau asks. I pause.

"…I might have. If the other choices had all proven… infeasible." I shake my head. "Do you understand now why I didn't want this job?" Hau looks out at the sky past the crystal dome.

"…I think so."

"Do you still want this job?" He says nothing for a long time.

"Yeah. Yeah, I do. I wanna make a difference in people's lives. I wanna protect people and keep Alola safe and happy. And… I wanna be strong. Strong enough to do what's right when it matters… and strong enough to admit when I'm weak." I close my eyes and smile. I think he's starting to get it. It won't be long now.

"Responsibility," I say, leaving the word hanging in the air for a few moments. "When you have power, no choice is easy. You can cross terrible lines to protect the people and places you love, or you can risk everything and everyone in the name of doing what's right. It's not up to me to tell you what's right, but either way, you have to own your choices, because it's those choices that make you who you are. When you hide from your responsibility, you hide from yourself." Hau frowns.

"What does that make you?" he asks. I laugh.

"A coward. Always have been." We sit together for a while, watching the clouds drift by.
"Anyway," I say, breaking the silence. "That's enough about me. How about what you've been doing? You're still helping Hala retrain the Team Skull grunts, aren't you? How are they?"

"They're all doing really well! They're taking to this whole rescue team deal way better than I ever woulda thought for such a bunch of troublemakers. We ran a practice mission out on Ten Carat Hill the other day and they totally aced it! Cool under pressure, and awesome team spirit! I guess they just needed someone to give them a second chance! One of them in particular."

"Oh? Who's that?"

"Oh, you know, no one special. Just your boyfriend," Hau says teasingly.

"He's not my boyfriend," I tell him crossly. "I owe him a drink, that's all."

"Really? 'A' drink?" Hau asks sceptically.

"Well… okay, maybe I owe him two or three drinks, after my coronation party…"

"Try six or seven!"

"I guess things got a little out of hand…" I concede.

"A little? You challenged Tapu Koko to a fistfight and broke every bone in your hand trying to punch out one of the old basalt idols!" Hau laughs. "That was the best part of our whole journey, hands down!"

"Yeah, yeah…" I grumble.

"And the way B fussed over you for days afterwards…"

"Yeah, yeah, whatever."

"Did you know he even asked my gramps to teach him first aid?"

"You are making that up."

"Trainer's honour!" he declares, putting his hand on his heart. "It was actually really cute." I roll my eyes, sigh in exasperation, and try to give him another surly grumble, but end up smiling in spite of myself.

"Well, should I pencil you in for next week? The usual time?"

"You bet!" Hau replies. "So watch out – me and my awesome team are definitely winning the next one!" I smile at him, and offer my hand to shake.

"You're this region's future, Hau – may the Tapu help us." He takes my hand.

"I won't let you down!"

And the funny thing is… I believe him.