#Remote Access Connection Manager
richardmhicks · 17 days ago
Always On VPN Security Updates June 2025
Patch Tuesday is upon us again; thankfully, it’s a light month for Always On VPN administrators. The Microsoft monthly security updates for June 2025 include just a few Windows Routing and Remote Access Service (RRAS) fixes. In addition, an update is available for a vulnerability in the Windows Remote Access Connection Manager. Significantly, DirectAccess administrators are affected this month by…
michaelbautista5434 · 6 months ago
Smart Home Innovation with SwitchBot: Secure Your Home with a Door Lock and Passcode
The Smart Home Industry is rapidly evolving, and SwitchBot is at the forefront of this exciting transformation. One of the standout products in their lineup is the innovative door lock with passcode functionality. This advanced smart lock not only enhances security but also provides unparalleled convenience for homeowners.
SwitchBot’s door lock with passcode allows you to manage access to your home effortlessly. With the ability to set unique passcodes for family members, friends, or service providers, you can ensure that only trusted individuals can enter your home. This feature is particularly useful for those who frequently have guests or require maintenance services.
In addition to its impressive security features, the door lock integrates seamlessly with other smart home devices, allowing for a cohesive smart home experience. You can control the lock remotely, receive notifications when someone enters or leaves, and even set temporary passcodes that expire after a certain period, providing peace of mind and flexibility.
SwitchBot is committed to making your home smarter and safer. With their user-friendly products and innovative solutions, you can easily transform your living space into a modern, secure environment. Embrace the future of home security with SwitchBot’s door lock and passcode, and enjoy the benefits of a connected home today!
Explore more about SwitchBot and their smart home solutions:
SwitchBot Official Website
SwitchBot Smart Home Products
govindhtech · 2 years ago
Tech Breakdown: What Is a SuperNIC? Get the Inside Scoop!
Generative AI is the most recent development in a rapidly evolving digital landscape, and the SuperNIC, a relatively new term, is one of the inventions that makes it feasible.
What Is a SuperNIC?
The SuperNIC is a new family of network accelerators created to accelerate hyperscale AI workloads on Ethernet-based clouds. Using remote direct memory access (RDMA) over Converged Ethernet (RoCE) technology, it offers extremely fast network connectivity for GPU-to-GPU communication, with throughput of up to 400Gb/s.
SuperNICs incorporate the following special qualities:
High-speed packet reordering, which ensures that data packets are received and processed in the same order in which they were originally sent, preserving the sequential integrity of the data flow.
Advanced congestion management, which uses network-aware algorithms and real-time telemetry data to control and prevent congestion in AI networks.
Programmable computation on the input/output (I/O) path, which makes it possible to adapt and extend the network architecture in AI cloud data centers.
A low-profile, power-efficient design that handles AI workloads effectively within constrained power budgets.
Optimization for full-stack AI, encompassing system software, communication libraries, application frameworks, networking, computing, and storage.
NVIDIA recently revealed the world’s first SuperNIC designed specifically for AI computing, built on the BlueField-3 networking architecture. It is a component of the NVIDIA Spectrum-X platform, where it integrates smoothly with the Spectrum-4 Ethernet switch system.
The NVIDIA Spectrum-4 switch system and BlueField-3 SuperNIC work together to provide an accelerated computing fabric that is optimized for AI applications. Spectrum-X outperforms conventional Ethernet settings by continuously delivering high levels of network efficiency.
Yael Shenhav, vice president of DPU and NIC products at NVIDIA, stated, “In a world where AI is driving the next wave of technological innovation, the BlueField-3 SuperNIC is a vital cog in the machinery.” “SuperNICs are essential components for enabling the future of AI computing because they guarantee that your AI workloads are executed with efficiency and speed.”
The Changing Environment of Networking and AI
Large language models and generative AI are causing a seismic change in the area of artificial intelligence. These potent technologies have opened up new avenues and made it possible for computers to perform new functions.
GPU-accelerated computing plays a critical role in the development of AI by processing massive amounts of data, training huge AI models, and enabling real-time inference. While this increased computing capacity has created opportunities, Ethernet cloud networks have also been put to the test.
The internet’s foundational technology, traditional Ethernet, was designed to link loosely coupled applications and provide broad compatibility. It was never intended for the complex computational requirements of contemporary AI workloads, which involve rapidly transferring large amounts of data, tightly coupled parallel processing, and unusual communication patterns, all of which call for optimal network connectivity.
Basic network interface cards (NICs) were created with interoperability, universal data transfer, and general-purpose computing in mind. They were never intended to handle the special difficulties brought on by the high processing demands of AI applications.
Standard NICs lack the characteristics and capabilities needed for efficient data transmission, low latency, and the predictable performance that AI workloads require. In contrast, SuperNICs are designed specifically for contemporary AI workloads.
Benefits of SuperNICs in AI Computing Environments
Data processing units (DPUs) deliver high throughput, low-latency network connectivity, and many other sophisticated capabilities. Since their introduction in 2020, DPUs have become increasingly common in cloud computing, largely because of their ability to isolate, accelerate, and offload computation from data center hardware.
SuperNICs and DPUs share many characteristics and functions; however, SuperNICs are specifically designed to accelerate networking for artificial intelligence.
The performance of distributed AI training and inference communication flows is highly dependent on the availability of network capacity. Known for their elegant designs, SuperNICs scale better than DPUs and may provide an astounding 400Gb/s of network bandwidth per GPU.
When GPUs and SuperNICs are matched 1:1 in a system, AI workload efficiency may be greatly increased, resulting in higher productivity and better business outcomes.
SuperNICs are intended solely to accelerate networking for AI cloud computing. As a result, they use less processing power than a DPU, which needs a lot of processing power to offload applications from a host CPU.
The reduced compute requirements also mean lower power usage, which is especially important in systems containing up to eight SuperNICs.
One of the SuperNIC’s other unique selling points is its specialized AI networking capabilities. It provides optimal congestion control, adaptive routing, and out-of-order packet handling when tightly connected with an AI-optimized NVIDIA Spectrum-4 switch. Ethernet AI cloud settings are accelerated by these cutting-edge technologies.
Transforming cloud computing with AI
The NVIDIA BlueField-3 SuperNIC is essential for AI-ready infrastructure because of its many advantages.
Maximum efficiency for AI workloads: The BlueField-3 SuperNIC is perfect for AI workloads since it was designed specifically for network-intensive, massively parallel computing. It guarantees bottleneck-free, efficient operation of AI activities.
Performance that is consistent and predictable: The BlueField-3 SuperNIC makes sure that each job and tenant in multi-tenant data centers, where many jobs are executed concurrently, is isolated, predictable, and unaffected by other network operations.
Secure multi-tenant cloud infrastructure: Data centers that handle sensitive data place a high premium on security. High security levels are maintained by the BlueField-3 SuperNIC, allowing different tenants to cohabit with separate data and processing.
Broad network infrastructure: The BlueField-3 SuperNIC is very versatile and can be easily adjusted to meet a wide range of different network infrastructure requirements.
Wide compatibility with server manufacturers: The BlueField-3 SuperNIC integrates easily with the majority of enterprise-class servers without using an excessive amount of power in data centers.
soclearlytosee · 20 days ago
I'm starting to think that ep 6 will involve El piggybacking into Vecna's Mindscape via Will's mind.
We know from pap photos/videos that Lucas and Robin speed to the hospital in Joyce's car for a big sequence there that seems to bridge episode 5 and episode 6. The other three major members of their group at this point are Will, Joyce, and Mike, so it makes sense that those three are together - maybe even in an ambulance heading to the hospital as well. (I believe this connection first came from @/pinkeoni and @/finalgirllwillbyers)
This is conspiracy theory brained but Noah is the only member of this midseason group that Shawn Levy (who is directing ep 6) didn't post a photo of in-costume on set, which could be because something spoilery is going on with Will visually in episode 6 (ie, that he's in a hospital gown or otherwise injured/indisposed).
I'm currently a loverslakegate agnostic but I do think signs point to Will having a traumatic experience in a body of water in episode 5. Lucas's pants also get muddy and he changes into the blue WSQK sweatshirt by the time they get to the hospital, and Joyce is the other one with the bonus outfit change somewhere around here.
El is in Shawn's photos of a confusing cross-section of the different midseason groups at WSQK, so she may be back from her Upside Down side quest by now. but regardless of her exact location, we know she can piggyback remotely.
As has been speculated for a while now, Escape from Camazotz likely refers to the vision that Henry has the kids in at the Creel House. Camazotz makes the most sense in that context since we can assume Holly and her classmates call it that because they're reading A Wrinkle in Time in class.
so if that's the case, who is escaping? Max and Holly are our likeliest candidates.
Credits for a New Mexico unit have started to pop up on IMDb for episodes 4, 6, and 8, and Sadie was placed in Albuquerque while that was active. It's likely those are Mindscape flashbacks of Henry in Nevada (in particular the incident alluded to in The First Shadow where Henry disappears for 12 hours on/around his 8th birthday in a cave system and comes back changed) and/or desert-y Dimension X. The Upside Down church set probably fits into this too.
basically, all these pieces make me think we could get something like:
Will is unconscious/indisposed by ep 5 into ep 6, maybe in relation to however his connection with Vecna is manifesting this season reaching its peak,
El and/or someone else in the group realizes that connection is an access point for El to get into Vecna's mind (looking to rescue Max and/or Holly),
there is a piggyback scenario where El, Will, Max, and possibly Holly witness some chunk of the big Henry lore reveal we know is coming,
and ultimately El and/or Will manage to free Max and/or Holly.
that would bring some resolution to a couple threads that should drive the first part of the season (finding Max, rescuing Holly from whatever her exact vanishing situation is) and uncover key information to tee up our final act.
I originally felt like Max wouldn't wake up until the very end of the main season plotline just because she'll be pretty physically frail, but her chair is in that photo of a huge group at Hawkins National Lab, which should be after this big hospital sequence chronologically.
hopefully if something like this happens we would get to spend some time in Will's mind too. the Nevada flashbacks also possibly being present in ep 4 maybe connects us to the 1979 flashback we're getting with Will, Mike, and Jonathan/Will's Mindscape generally, "Sorcerer" and all.
also, when they escape Camazotz the first time in A Wrinkle in Time with the missing person they went there to retrieve, they have to leave someone behind...
donjuaninhell · 1 year ago
How I ditched streaming services and learned to love Linux: A step-by-step guide to building your very own personal media streaming server (V2.0: REVISED AND EXPANDED EDITION)
This is a revised, corrected and expanded version of my tutorial on setting up a personal media server that previously appeared on my old blog (donjuan-auxenfers). I expect that that post is still making the rounds (hopefully with my addendum on modifying group share permissions in Ubuntu to circumvent 0x8007003B "Unexpected Network Error" messages in Windows 10/11 when transferring files) but I have no way of checking. Anyway this new revised version of the tutorial corrects one or two small errors I discovered when rereading what I wrote, adds links to all products mentioned and is just more polished generally. I also expanded it a bit, pointing more adventurous users toward programs such as Sonarr/Radarr/Lidarr and Overseerr which can be used for automating user requests and media collection.
So then, what is this tutorial? This is a tutorial on how to build and set up your own personal media server using Ubuntu as an operating system and Plex (or Jellyfin) to not only manage your media, but to also stream that media to your devices both at home and abroad anywhere in the world where you have an internet connection. Its intent is to show you how building a personal media server and stuffing it full of films, TV, and music that you acquired through indiscriminate and voracious media piracy various legal methods will free you to completely ditch paid streaming services. No more will you have to pay for Disney+, Netflix, HBOMAX, Hulu, Amazon Prime, Peacock, CBS All Access, Paramount+, Crave or any other streaming service that is not named Criterion Channel. Instead whenever you want to watch your favourite films and television shows, you’ll have your own personal service that only features things that you want to see, with files that you have control over. And for music fans out there, both Jellyfin and Plex support music streaming, meaning you can even ditch music streaming services. Goodbye Spotify, Youtube Music, Tidal and Apple Music, welcome back unreasonably large MP3 (or FLAC) collections.
On the hardware front, I’m going to offer a few options catered towards different budgets and media library sizes. Getting a media server up and running using this guide will cost you anywhere from $450 CAD/$325 USD at the low end to $1500 CAD/$1100 USD at the high end (it could go higher). My server was priced closer to the higher figure, but I went and got a lot more storage than most people need. If that seems like a little much, consider for a moment: do you have a roommate, a close friend, or a family member who would be willing to chip in a few bucks towards your little project provided they get access? Well, that's how I funded my server. It might also be worth thinking about the cost over time, i.e. how much you spend yearly on subscriptions vs. the one-time cost of setting up a server. Additionally there's just the joy of being able to scream "fuck you" at all those show-cancelling, library-deleting, hedge fund vampire CEOs who run the studios, by denying them your money. Drive a stake through David Zaslav's heart.
On the software side I will walk you step-by-step through installing Ubuntu as your server's operating system, configuring your storage as a RAIDz array with ZFS, sharing your zpool to Windows with Samba, running a remote connection between your server and your Windows PC, and then a little about getting started with Plex/Jellyfin. Every terminal command you will need to input will be provided, and I even share a custom #bash script that will make used vs. available drive space on your server display correctly in Windows.
If you have a different preferred flavour of Linux (Arch, Manjaro, Redhat, Fedora, Mint, OpenSUSE, CentOS, Slackware etc. et. al.) and are aching to tell me off for being basic and using Ubuntu, this tutorial is not for you. The sort of person with a preferred Linux distro is the sort of person who can do this sort of thing in their sleep. Also I don't care. This tutorial is intended for the average home computer user. This is also why we’re not using a more exotic home server solution like running everything through Docker Containers and managing it through a dashboard like Homarr or Heimdall. While such solutions are fantastic and can be very easy to maintain once you have it all set up, wrapping your brain around Docker is a whole thing in and of itself. If you do follow this tutorial and had fun putting everything together, then I would encourage you to return in a year’s time, do your research and set up everything with Docker Containers.
Lastly, this is a tutorial aimed at Windows users. Although I was a daily user of OS X for many years (roughly 2008-2023) and I've dabbled quite a bit with various Linux distributions (mostly Ubuntu and Manjaro), my primary OS these days is Windows 11. Many things in this tutorial will still be applicable to Mac users, but others (e.g. setting up shares) you will have to look up for yourself. I doubt it would be difficult to do so.
Nothing in this tutorial will require feats of computing expertise. All you will need is a basic computer literacy (i.e. an understanding of what a filesystem and directory are, and a degree of comfort in the settings menu) and a willingness to learn a thing or two. While this guide may look overwhelming at first glance, it is only because I want to be as thorough as possible. I want you to understand exactly what it is you're doing, I don't want you to just blindly follow steps. If you half-way know what you’re doing, you will be much better prepared if you ever need to troubleshoot.
Honestly, once you have all the hardware ready it shouldn't take more than an afternoon or two to get everything up and running.
(This tutorial is just shy of seven thousand words long so the rest is under the cut.)
Step One: Choosing Your Hardware
Linux is a lightweight operating system; depending on the distribution there's close to no bloat. There are recent distributions available at this very moment that will run perfectly fine on a fourteen year old i3 with 4GB of RAM. Moreover, running Plex or Jellyfin isn’t resource intensive in 90% of use cases. All this is to say, we don’t require an expensive or powerful computer. This means that there are several options available: 1) use an old computer you already have sitting around but aren't using, 2) buy a used workstation from eBay, or, what I believe to be the best option, 3) order an N100 Mini-PC from AliExpress or Amazon.
Note: If you already have an old PC sitting around that you’ve decided to use, fantastic, move on to the next step.
When weighing your options, keep a few things in mind: the number of people you expect to be streaming simultaneously at any one time, the resolution and bitrate of your media library (4k video takes a lot more processing power than 1080p) and most importantly, how many of those clients are going to be transcoding at any one time. Transcoding is what happens when the playback device does not natively support direct playback of the source file. This can happen for a number of reasons, such as the playback device's native resolution being lower than the file's internal resolution, or because the source file was encoded in a video codec unsupported by the playback device.
Ideally we want any transcoding to be performed by hardware. This means we should be looking for a computer with an Intel processor with Quick Sync. Quick Sync is a dedicated core on the CPU die designed specifically for video encoding and decoding. This specialized hardware makes for highly efficient transcoding both in terms of processing overhead and power draw. Without these Quick Sync cores, transcoding must be brute forced through software. This takes up much more of a CPU’s processing power and requires much more energy. But not all Quick Sync cores are created equal, and you need to keep this in mind if you've decided either to use an old computer or to shop for a used workstation on eBay.
Any Intel processor from second generation Core (Sandy Bridge circa 2011) onward has Quick Sync cores. It's not until 6th gen (Skylake), however, that the cores support the H.265 HEVC codec. Intel’s 10th gen (Comet Lake) processors introduce support for 10bit HEVC and HDR tone mapping. And the recent 12th gen (Alder Lake) processors brought with them hardware AV1 decoding. As an example, while an 8th gen (Coffee Lake) i5-8500 will be able to hardware transcode a H.265 encoded file, it will fall back to software transcoding if given a 10bit H.265 file. If you’ve decided to use that old PC or to look on eBay for an old Dell Optiplex, keep this in mind.
Note 1: The price of old workstations varies wildly and fluctuates frequently. If you get lucky and go shopping shortly after a workplace has liquidated a large number of their workstations you can find deals for as low as $100 on a barebones system, but generally an i5-8500 workstation with 16GB RAM will cost you somewhere in the area of $260 CAD/$200 USD.
Note 2: The AMD equivalent to Quick Sync is called Video Core Next, and while it's fine, it's not as efficient and not as mature a technology. It was only introduced with the first generation Ryzen CPUs and it only got decent with their newest CPUs; besides, we want something cheap.
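One optional aside that isn't part of the original steps: if you already have a candidate machine in front of you and aren't sure which codecs its Quick Sync hardware actually supports, you can check from a Linux live session or install with the VA-API info tool (the package name below assumes Ubuntu/Debian):
sudo apt install vainfo
vainfo | grep -i -E "hevc|av1"
If the output lists HEVC Main10 or AV1 profiles, the iGPU can hardware-decode those formats; if nothing shows up for them, expect software transcoding for that kind of media.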
Alternatively you could forgo having to keep track of what generation of CPU is equipped with Quick Sync cores that feature support for which codecs, and just buy an N100 mini-PC. For around the same price or less of a used workstation you can pick up a mini-PC with an Intel N100 processor. The N100 is a four-core processor based on the 12th gen Alder Lake architecture and comes equipped with the latest revision of the Quick Sync cores. These little processors offer astounding hardware transcoding capabilities for their size and power draw. Otherwise they perform equivalent to an i5-6500, which isn't a terrible CPU. A friend of mine uses an N100 machine as a dedicated retro emulation gaming system and it does everything up to 6th generation consoles just fine. The N100 is also a remarkably efficient chip, it sips power. In fact, the difference between running one of these and an old workstation could work out to hundreds of dollars a year in energy bills depending on where you live.
You can find these Mini-PCs all over Amazon or for a little cheaper on AliExpress. They range in price from $170 CAD/$125 USD for a no name N100 with 8GB RAM to $280 CAD/$200 USD for a Beelink S12 Pro with 16GB RAM. The brand doesn't really matter, they're all coming from the same three factories in Shenzen, go for whichever one fits your budget or has features you want. 8GB RAM should be enough, Linux is lightweight and Plex only calls for 2GB RAM. 16GB RAM might result in a slightly snappier experience, especially with ZFS. A 256GB SSD is more than enough for what we need as a boot drive, but going for a bigger drive might allow you to get away with things like creating preview thumbnails for Plex, but it’s up to you and your budget.
The Mini-PC I wound up buying was a Firebat AK2 Plus with 8GB RAM and a 256GB SSD. It looks like this:
[photo of the Firebat AK2 Plus Mini-PC]
Note: Be forewarned that if you decide to order a Mini-PC from AliExpress, you should check what type of power adapter it ships with. The mini-PC I bought came with an EU power adapter and I had to supply my own North American power supply. Thankfully this is a minor issue as barrel plug 30W/12V/2.5A power adapters are easy to find and can be had for $10.
Step Two: Choosing Your Storage
Storage is the most important part of our build. It is also the most expensive. Thankfully it’s also the most easily upgrade-able down the line.
For people with a smaller media collection (4TB to 8TB), a more limited budget, or who will only ever have two simultaneous streams running, I would say that the most economical course of action would be to buy a USB 3.0 8TB external HDD. Something like this one from Western Digital or this one from Seagate. One of these external drives will cost you in the area of $200 CAD/$140 USD. Down the line you could add a second external drive or replace it with a multi-drive RAIDz set up such as detailed below.
If a single external drive is the path for you, move on to step three.
For people with larger media libraries (12TB+), who prefer media in 4k, or who care about data redundancy, the answer is a RAID array featuring multiple HDDs in an enclosure.
Note: If you are using an old PC or used workstation as your server and have the room for at least three 3.5" drives, and as many open SATA ports on your motherboard, you won't need an enclosure, just install the drives into the case. If your old computer is a laptop or doesn’t have room for more internal drives, then I would suggest an enclosure.
The minimum number of drives needed to run a RAIDz array is three, and seeing as RAIDz is what we will be using, you should be looking for an enclosure with three to five bays. I think that four disks makes for a good compromise for a home server. Regardless of whether you go for a three, four, or five bay enclosure, do be aware that in a RAIDz array the space equivalent of one of the drives will be dedicated to parity at a ratio expressed by the equation 1 − 1/n i.e. in a four bay enclosure equipped with four 12TB drives, if we configured our drives in a RAIDz1 array we would be left with a total of 36TB of usable space (48TB raw size). The reason for why we might sacrifice storage space in such a manner will be explained in the next section.
A four bay enclosure will cost somewhere in the area of $200 CDN/$140 USD. You don't need anything fancy, we don't need anything with hardware RAID controls (RAIDz is done entirely in software) or even USB-C. An enclosure with USB 3.0 will perform perfectly fine. Don’t worry too much about USB speed bottlenecks. A mechanical HDD will be limited by the speed of its mechanism long before it will be limited by the speed of a USB connection. I've seen decent looking enclosures from TerraMaster, Yottamaster, Mediasonic and Sabrent.
When it comes to selecting the drives, as of this writing, the best value (dollar per gigabyte) are those in the range of 12TB to 20TB. I settled on 12TB drives myself. If 12TB to 20TB drives are out of your budget, go with what you can afford, or look into refurbished drives. I'm not sold on the idea of refurbished drives but many people swear by them.
When shopping for hard drives, search for drives designed specifically for NAS use. Drives designed for NAS use typically have better vibration dampening and are designed to be active 24/7. They will also often make use of CMR (conventional magnetic recording) as opposed to SMR (shingled magnetic recording), which nets them a sizable read/write performance bump over typical desktop drives. Seagate Ironwolf and Toshiba NAS are both well regarded brands when it comes to NAS drives. I would avoid Western Digital Red drives at this time. WD Reds were a go-to recommendation up until earlier this year, when it was revealed that they feature firmware that will often throw up false SMART warnings telling you to replace the drive at the three year mark, even when there is nothing at all wrong with that drive and it will likely be good for another six, seven, or more years.
Step Three: Installing Linux
For this step you will need a USB thumbdrive of at least 6GB in capacity, an .ISO of Ubuntu, and a way to make that thumbdrive bootable media.
First download a copy of Ubuntu desktop. (For best performance we could download the Server release, but for new Linux users I would recommend against it. The Server release is strictly command line interface only, and having a GUI is very helpful for most people. Not many people are wholly comfortable doing everything through the command line, I'm certainly not one of them, and I grew up with DOS 6.0.) 22.04.3 Jammy Jellyfish is the current Long Term Support release, and this is the one to get.
Download the .ISO and then download and install balenaEtcher on your Windows PC. BalenaEtcher is an easy to use program for creating bootable media, you simply insert your thumbdrive, select the .ISO you just downloaded, and it will create a bootable installation media for you.
Once you've made a bootable media and you've got your Mini-PC (or your old PC/used workstation) in front of you, hook it directly into your router with an ethernet cable, and then plug in the HDD enclosure, a monitor, a mouse and a keyboard. Now turn that sucker on and hit whatever key gets you into the BIOS (typically ESC, DEL or F2). If you’re using a Mini-PC check to make sure that the P1 and P2 power limits are set correctly; my N100's P1 limit was set at 10W, a full 20W under the chip's power limit. Also make sure that the RAM is running at the advertised speed. My Mini-PC’s RAM was set at 2333Mhz out of the box when it should have been 3200Mhz. Once you’ve done that, key over to the boot order and place the USB drive first in the boot order. Then save the BIOS settings and restart.
After you restart you’ll be greeted by Ubuntu's installation screen. Installing Ubuntu is really straightforward: select the "minimal" installation option, as we won't need anything on this computer except for a browser (Ubuntu comes preinstalled with Firefox) and Plex Media Server/Jellyfin Media Server. Also remember to delete and reformat that Windows partition! We don't need it.
Step Four: Installing ZFS and Setting Up the RAIDz Array
Note: If you opted for just a single external HDD skip this step and move onto setting up a Samba share.
Once Ubuntu is installed it's time to configure our storage by installing ZFS to build our RAIDz array. ZFS is a "next-gen" file system that is both massively flexible and massively complex. It's capable of snapshot backups and self-healing error correction, and ZFS pools can be configured with drives operating in a supplemental manner alongside the storage vdev (e.g. fast cache, a dedicated separate intent log, hot swap spares, etc.). It's also a file system very amenable to fine tuning. Block and sector size are adjustable to use case and you're afforded the option of different methods of inline compression. If you'd like a very detailed overview and explanation of its various features and tips on tuning a ZFS array check out these articles from Ars Technica. For now we're going to ignore all these features and keep it simple: we're going to pull our drives together into a single vdev running in RAIDz, which will be the entirety of our zpool, no fancy cache drive or SLOG.
Open up the terminal and type the following commands:
sudo apt update
then
sudo apt install zfsutils-linux
This will install the ZFS utility. Verify that it's installed with the following command:
zfs --version
Now, it's time to check that the HDDs we have in the enclosure are healthy, running, and recognized. We also want to find out their device IDs and take note of them:
sudo fdisk -l
Note: You might be wondering why some of these commands require "sudo" in front of them while others don't. "Sudo" is short for "super user do”. When and where "sudo" is used has to do with the way permissions are set up in Linux. Only the "root" user has the access level to perform certain tasks in Linux. As a matter of security and safety regular user accounts are kept separate from the "root" user. It's not advised (or even possible) to boot into Linux as "root" with most modern distributions. Instead by using "sudo" our regular user account is temporarily given the power to do otherwise forbidden things. Don't worry about it too much at this stage, but if you want to know more check out this introduction.
If everything is working you should get a list of the various drives detected along with their device IDs which will look like this: /dev/sdc. You can also check the device IDs of the drives by opening the disk utility app. Jot these IDs down as we'll need them for our next step, creating our RAIDz array.
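A quick optional aside that isn't strictly part of this guide: /dev/sdX letters can occasionally shuffle around between boots, so a lot of ZFS documentation prefers building pools from the drives' persistent identifiers instead. You can list those with:
ls -l /dev/disk/by-id
If you'd rather go that route, just substitute the corresponding /dev/disk/by-id/... paths for the /dev/sdX names in the zpool create commands below; everything else stays the same.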
RAIDz is similar to RAID-5 in that instead of striping your data over multiple disks, exchanging redundancy for speed and available space (RAID-0), or mirroring your data writing by two copies of every piece (RAID-1), it instead writes parity blocks across the disks in addition to striping, this provides a balance of speed, redundancy and available space. If a single drive fails, the parity blocks on the working drives can be used to reconstruct the entire array as soon as a replacement drive is added.
Additionally, RAIDz improves over some of the common RAID-5 flaws. It's more resilient and capable of self healing, as it is capable of automatically checking for errors against a checksum. It's more forgiving in this way, and it's likely that you'll be able to detect when a drive is dying well before it fails. A RAIDz array can survive the loss of any one drive.
Note: While RAIDz is indeed resilient, if a second drive fails during the rebuild, you're fucked. Always keep backups of things you can't afford to lose. This tutorial, however, is not about proper data safety.
To create the pool, use the following command:
sudo zpool create "zpoolnamehere" raidz "device IDs of drives we're putting in the pool"
For example, let's creatively name our zpool "mypool". This pool will consist of four drives which have the device IDs: sdb, sdc, sdd, and sde. The resulting command will look like this:
sudo zpool create mypool raidz /dev/sdb /dev/sdc /dev/sdd /dev/sde
If as an example you bought five HDDs and decided you wanted more redundancy, dedicating two drives to this purpose, we would modify the command to "raidz2" and it would look something like the following:
sudo zpool create mypool raidz2 /dev/sdb /dev/sdc /dev/sdd /dev/sde /dev/sdf
An array configured like this is known as RAIDz2 and is able to survive two disk failures.
Once the zpool has been created, we can check its status with the command:
zpool status
Or more concisely with:
zpool list
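For reference, a healthy pool reported by zpool status looks roughly like this (the pool name, device names, and numbers below are just illustrative placeholders; yours will differ):
  pool: mypool
 state: ONLINE
config:
        NAME        STATE     READ WRITE CKSUM
        mypool      ONLINE       0     0     0
          raidz1-0  ONLINE       0     0     0
            sdb     ONLINE       0     0     0
            sdc     ONLINE       0     0     0
            sdd     ONLINE       0     0     0
            sde     ONLINE       0     0     0
errors: No known data errors
As long as the state reads ONLINE and the error counters sit at zero, you're in good shape.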
The nice thing about ZFS as a file system is that a pool is ready to go immediately after creation. If we were to set up a traditional RAID-5 array using mdadm, we'd have to sit through a potentially hours-long process of reformatting and partitioning the drives. Instead we're ready to go right out the gates.
The zpool should be automatically mounted to the filesystem after creation, check on that with the following:
df -hT | grep zfs
Note: If your computer ever loses power suddenly, say in event of a power outage, you may have to re-import your pool. In most cases, ZFS will automatically import and mount your pool, but if it doesn’t and you can't see your array, simply open the terminal and type sudo zpool import -a.
By default a zpool is mounted at /"zpoolname". The pool should be under our ownership but let's make sure with the following command:
sudo chown -R "yourlinuxusername" /"zpoolname"
Note: Changing file and folder ownership with "chown" and file and folder permissions with "chmod" are essential commands for much of the admin work in Linux, but we won't be dealing with them extensively in this guide. If you'd like a deeper tutorial and explanation you can check out these two guides: chown and chmod.
You can access the zpool file system through the GUI by opening the file manager (the Ubuntu default file manager is called Nautilus) and clicking on "Other Locations" on the sidebar, then entering the Ubuntu file system and looking for a folder with your pool's name. Bookmark the folder on the sidebar for easy access.
Your storage pool is now ready to go. Assuming that we already have some files on our Windows PC we want to copy to over, we're going to need to install and configure Samba to make the pool accessible in Windows.
Step Five: Setting Up Samba/Sharing
Samba is what's going to let us share the zpool with Windows and allow us to write to it from our Windows machine. First let's install Samba with the following commands:
sudo apt-get update
then
sudo apt-get install samba
Next create a password for Samba.
sudo smbpasswd -a "yourlinuxusername"
It will then prompt you to create a password. Just reuse your Ubuntu user password for simplicity's sake.
Note: if you're using just a single external drive replace the zpool location in the following commands with wherever it is your external drive is mounted, for more information see this guide on mounting an external drive in Ubuntu.
After you've created a password we're going to create a shareable folder in our pool with this command
mkdir /"zpoolname"/"foldername"
Now we're going to open the smb.conf file and make that folder shareable. Enter the following command.
sudo nano /etc/samba/smb.conf
This will open the .conf file in nano, the terminal text editor program. Now at the end of smb.conf add the following entry:
["foldername"]
path = /"zpoolname"/"foldername"
available = yes
valid users = "yourlinuxusername"
read only = no
writable = yes
browseable = yes
guest ok = no
Ensure that there are no line breaks between the lines and that there's a space on both sides of the equals sign. Our next step is to allow Samba traffic through the firewall:
sudo ufw allow samba
Finally restart the Samba service:
sudo systemctl restart smbd
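If you'd like to sanity check your work here (optional, but cheap insurance), Samba ships with a small utility called testparm that parses smb.conf and reports any syntax errors:
testparm
It will print any problems it finds along with a dump of your share definitions; if it loads without complaint, the config is at least syntactically sound.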
At this point we'll be able to access the pool, browse its contents, and read and write to it from Windows. But there's one more thing left to do: Windows doesn't natively support the ZFS file system and will read the used/available/total space in the pool incorrectly. Windows will read available space as total drive space, and all used space as null. This leads to Windows only displaying a dwindling amount of "available" space as the drives are filled. We can fix this! Functionally this doesn't actually matter, we can still write and read to and from the disk, it just makes it difficult to tell at a glance the proportion of used/available space, so this is an optional step but one I recommend (this step is also unnecessary if you're just using a single external drive). What we're going to do is write a little shell script in #bash. Open nano with the terminal with the command:
nano
Now insert the following code:
#!/bin/bash
CUR_PATH=`pwd`
ZFS_CHECK_OUTPUT=$(zfs get type $CUR_PATH 2>&1 > /dev/null) > /dev/null
if [[ $ZFS_CHECK_OUTPUT == *not\ a\ ZFS* ]]
then
    IS_ZFS=false
else
    IS_ZFS=true
fi
if [[ $IS_ZFS = false ]]
then
    df $CUR_PATH | tail -1 | awk '{print $2" "$4}'
else
    USED=$((`zfs get -o value -Hp used $CUR_PATH` / 1024)) > /dev/null
    AVAIL=$((`zfs get -o value -Hp available $CUR_PATH` / 1024)) > /dev/null
    TOTAL=$(($USED+$AVAIL)) > /dev/null
    echo $TOTAL $AVAIL
fi
Save the script as "dfree.sh" to /home/"yourlinuxusername", then change the file's permissions to make it executable with this command:
sudo chmod 774 dfree.sh
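If you want to verify the script before pointing Samba at it (an optional sanity check), run it from inside your pool; it should print two numbers, the total and available space in kilobytes:
cd /"zpoolname"
/home/"yourlinuxusername"/dfree.sh
If you get an error instead, double-check the file's location and permissions before moving on.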
Now open smb.conf with sudo again:
sudo nano /etc/samba/smb.conf
Now add this entry to the top of the configuration file to direct Samba to use the results of our script when Windows asks for a reading on the pool's used/available/total drive space:
[global]
dfree command = /home/"yourlinuxusername"/dfree.sh
Save the changes to smb.conf and then restart Samba again with the terminal:
sudo systemctl restart smbd
Now there’s one more thing we need to do to fully set up the Samba share, and that’s to modify a hidden group permission. In the terminal window type the following command:
sudo usermod -a -G sambashare "yourlinuxusername"
Then restart samba again:
sudo systemctl restart smbd
If we don’t do this last step, everything will appear to work fine, and you will even be able to see and map the drive from Windows and even begin transferring files, but you'd soon run into a lot of frustration, as every ten minutes or so a file would fail to transfer and you would get a window announcing “0x8007003B Unexpected Network Error”. This window would require your manual input to continue the transfer with the next file in the queue. And at the end it would reattempt to transfer whichever files failed the first time around. 99% of the time they’ll go through on that second try, but this is still all a major pain in the ass, especially if you’ve got a lot of data to transfer or you want to step away from the computer for a while.
It turns out Samba can act a little weirdly with the higher read/write speeds of RAIDz arrays and transfers from Windows, and will intermittently crash and restart itself if this group option isn’t changed. Inputting the above command will prevent you from ever seeing that window.
The last thing we're going to do before switching over to our Windows PC is grab the IP address of our Linux machine. Enter the following command:
hostname -I
This will spit out this computer's IP address on the local network (it will look something like 192.168.0.x); write it down. It might be a good idea once you're done here to go into your router settings and reserve that IP for your Linux system in the DHCP settings. Check the manual for your specific model of router on how to access its settings; typically it can be reached by opening a browser and typing http://192.168.0.1 in the address bar, but your router may be different.
Okay we’re done with our Linux computer for now. Get on over to your Windows PC, open File Explorer, right click on Network and click "Map network drive". Select Z: as the drive letter (you don't want to map the network drive to a letter you could conceivably be using for other purposes) and enter the IP of your Linux machine and location of the share like so: \\"LINUXCOMPUTERLOCALIPADDRESSGOESHERE"\"zpoolnamegoeshere"\. Windows will then ask you for your username and password, enter the ones you set earlier in Samba and you're good. If you've done everything right it should look something like this:
[screenshot of the mapped network drive in Windows File Explorer]
You can now start moving media over from Windows to the share folder. It's a good idea to have a hard line running to all machines. Moving files over Wi-Fi is going to be tortuously slow, the only thing that’s going to make the transfer time tolerable (hours instead of days) is a solid wired connection between both machines and your router.
Step Six: Setting Up Remote Desktop Access to Your Server
After the server is up and going, you’ll want to be able to access it remotely from Windows. Barring serious maintenance/updates, this is how you'll access it most of the time. On your Linux system open the terminal and enter:
sudo apt install xrdp
Then:
sudo systemctl enable xrdp
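If you want to confirm the service actually came up before moving on, you can check it with:
sudo systemctl status xrdp
It should report "active (running)".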
Once it's finished installing, open “Settings” on the sidebar and turn off "automatic login" in the User category. Then log out of your account. Attempting to remotely connect to your Linux computer while you’re logged in will result in a black screen!
Now get back on your Windows PC, open search and look for "RDP". A program called "Remote Desktop Connection" should pop up, open this program as an administrator by right-clicking and selecting “run as an administrator”. You’ll be greeted with a window. In the field marked “Computer” type in the IP address of your Linux computer. Press connect and you'll be greeted with a new window and prompt asking for your username and password. Enter your Ubuntu username and password here.
If everything went right, you’ll be logged into your Linux computer. If the performance is sluggish, adjust the display options. Lowering the resolution and colour depth do a lot to make the interface feel snappier.
Remote access is how we're going to be using our Linux system from now, barring edge cases like needing to get into the BIOS or upgrading to a new version of Ubuntu. Everything else from performing maintenance like a monthly zpool scrub to checking zpool status and updating software can all be done remotely.
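As an example of what that maintenance looks like (this is just my own routine, not a required step), a scrub is a single command run over the remote desktop connection:
sudo zpool scrub mypool
You can then watch its progress with zpool status. If you'd rather automate it, open root's crontab with sudo crontab -e and add a line like the following, which kicks off a scrub at 3 a.m. on the first of every month (the path assumes Ubuntu's default install location; run "which zpool" to confirm it on your system):
0 3 1 * * /usr/sbin/zpool scrub mypool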
This is how my server lives its life now, happily humming and chirping away on the floor next to the couch in a corner of the living room.
Step Seven: Plex Media Server/Jellyfin
Okay, we’ve got all the groundwork finished and our server is almost up and running. We’ve got Ubuntu installed, our storage array is primed, we’ve set up remote connections and sharing, and maybe we’ve moved over some of our favourite movies and TV shows.
Now we need to decide on the media server software to use which will stream our media to us and organize our library. For most people I’d recommend Plex. It just works 99% of the time. That said, Jellyfin has a lot to recommend it by too, even if it is rougher around the edges. Some people run both simultaneously, it’s not that big of an extra strain. I do recommend doing a little bit of your own research into the features each platform offers, but as a quick run down, consider some of the following points:
Plex is closed source and is funded through PlexPass purchases while Jellyfin is open source and entirely user driven. This means a number of things: for one, Plex requires you to purchase a “PlexPass” (purchased as a one time lifetime fee $159.99 CDN/$120 USD or paid for on a monthly or yearly subscription basis) in order to access to certain features, like hardware transcoding (and we want hardware transcoding) or automated intro/credits detection and skipping, Jellyfin offers some of these features for free through plugins. Plex supports a lot more devices than Jellyfin and updates more frequently. That said, Jellyfin's Android and iOS apps are completely free, while the Plex Android and iOS apps must be activated for a one time cost of $6 CDN/$5 USD. But that $6 fee gets you a mobile app that is much more functional and features a unified UI across platforms, the Plex mobile apps are simply a more polished experience. The Jellyfin apps are a bit of a mess and the iOS and Android versions are very different from each other.
Jellyfin’s actual media player is more fully featured than Plex's, but on the other hand Jellyfin's UI, library customization and automatic media tagging really pale in comparison to Plex. Streaming your music library is free through both Jellyfin and Plex, but Plex offers the PlexAmp app for dedicated music streaming which boasts a number of fantastic features, unfortunately some of those fantastic features require a PlexPass. If your internet is down, Jellyfin can still do local streaming, while Plex can fail to play files unless you've got it set up a certain way. Jellyfin has a slew of neat niche features like support for Comic Book libraries with the .cbz/.cbt file types, but then Plex offers some free ad-supported TV and films, they even have a free channel that plays nothing but Classic Doctor Who.
Ultimately it's up to you, I settled on Plex because although some features are pay-walled, it just works. It's more reliable and easier to use, and a one-time fee is much easier to swallow than a subscription. I had a pretty easy time getting my boomer parents and tech illiterate brother introduced to and using Plex and I don't know if I would've had as easy a time doing that with Jellyfin. I do also need to mention that Jellyfin does take a little extra bit of tinkering to get going in Ubuntu, you’ll have to set up process permissions, so if you're more tolerant to tinkering, Jellyfin might be up your alley and I’ll trust that you can follow their installation and configuration guide. For everyone else, I recommend Plex.
So pick your poison: Plex or Jellyfin.
Note: The easiest way to download and install either of these packages in Ubuntu is through the Snap Store.
After you've installed one (or both), opening either app will launch a browser window into the browser version of the app allowing you to set all the options server side.
The process of creating media libraries is essentially the same in both Plex and Jellyfin. You create separate libraries for Television, Movies, and Music and add the folders which contain the respective types of media to their respective libraries. The only difficult or time consuming aspect is ensuring that your files and folders follow the appropriate naming conventions:
Plex naming guide for Movies
Plex naming guide for Television
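To give a rough illustration of what those conventions look like on disk (the titles below are just examples; check the guides above for edge cases like multi-part films, extras, and specials):
Movies/
    The Thing (1982)/
        The Thing (1982).mkv
TV Shows/
    Doctor Who (1963)/
        Season 01/
            Doctor Who (1963) - s01e01 - An Unearthly Child.mkv
Music/
    Artist Name/
        Album Name/
            01 - Track Name.flac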
Jellyfin follows the same naming rules but I find their media scanner to be a lot less accurate and forgiving than Plex's. Once you've selected the folders to be scanned the service will scan your files, tagging everything and adding metadata. Although I do find Plex more accurate, it can still erroneously tag some things and you might have to manually clean up some tags in a large library. (When I initially created my library it tagged the 1963-1989 Doctor Who as some Korean soap opera and I needed to manually select the correct match, after which everything was tagged normally.) It can also be a bit testy with anime (especially OVAs), so be sure to check TVDB to ensure that you have your files and folders structured and named correctly. If something is not showing up at all, double check the name.
Once that's done, organizing and customizing your library is easy. You can set up collections, grouping items together to fit a theme or collect together all the entries in a franchise. You can make playlists, and add custom artwork to entries. It's fun setting up collections with posters to match, there are even several websites dedicated to help you do this like PosterDB. As an example, below are two collections in my library, one collecting all the entries in a franchise, the other follows a theme.
My Star Trek collection, featuring all eleven television series, and thirteen films.
My Best of the Worst collection, featuring sixty-nine films previously showcased on RedLetterMedia’s Best of the Worst. They’re all absolutely terrible and I love them.
As for settings, ensure you've got Remote Access going; it should work automatically. Also be sure to set your upload speed after running a speed test. In the library settings set the database cache to 2000MB to ensure a snappier and more responsive browsing experience, and then check that playback quality is set to original/maximum. If you’re severely bandwidth limited on your upload and have remote users, you might want to limit the remote stream bitrate to something more reasonable; just as a note of comparison, Netflix’s 1080p bitrate is approximately 5Mbps, although almost anyone watching through a Chromium-based browser is streaming at 720p and 3Mbps. Other than that you should be good to go. For actually playing your files, there's a Plex app for just about every platform imaginable. I mostly watch television and films on my laptop using the Windows Plex app, but I also use the Android app, which can broadcast to the Chromecast connected to the TV in the office, and the Android TV app for our smart TV. Both are fully functional and easy to navigate, and I can also attest to the OS X version being equally functional.
Part Eight: Finding Media
Now, this is not really a piracy tutorial, there are plenty of those out there. But if you’re unaware, BitTorrent is free and pretty easy to use, just pick a client (qBittorrent is the best) and go find some public trackers to peruse. Just know now that all the best trackers are private and invite only, and that they can be exceptionally difficult to get into. I’m already on a few, and even then, some of the best ones are wholly out of my reach.
If you decide to take the left hand path and turn to Usenet you’ll have to pay. First you’ll need to sign up with a provider like Newshosting or EasyNews for access to Usenet itself, and then to actually find anything you’re going to need to sign up with an indexer like NZBGeek or NZBFinder. There are dozens of indexers, and many people cross post between them, but for more obscure media it’s worth checking multiple. You’ll also need a binary downloader like SABnzbd. That caveat aside, Usenet is faster, bigger, older, less traceable than BitTorrent, and altogether slicker. I honestly prefer it, and I'm kicking myself for taking this long to start using it because I was scared off by the price. I’ve found so many things on Usenet that I had sought in vain elsewhere for years, like a 2010 Italian film about a massacre perpetrated by the SS that played the festival circuit but never received a home media release; some absolute hero uploaded a rip of a festival screener DVD to Usenet. Anyway, figure out the rest of this shit on your own and remember to use protection, get yourself behind a VPN, use a SOCKS5 proxy with your BitTorrent client, etc.
On the legal side of things, if you’re around my age, you (or your family) probably have a big pile of DVDs and Blu-Rays sitting around unwatched and half forgotten. Why not do a bit of amateur media preservation, rip them and upload them to your server for easier access? (Your tools for this are going to be Handbrake to do the ripping and AnyDVD to break any encryption.) I went to the trouble of ripping all my SCTV DVDs (five box sets worth) because none of it is on streaming nor could it be found on any pirate source I tried. I’m glad I did, forty years on it’s still one of the funniest shows to ever be on TV.
Part Nine/Epilogue: Sonarr/Radarr/Lidarr and Overseerr
There are a lot of ways to automate your server for better functionality or to add features you and other users might find useful. Sonarr, Radarr, and Lidarr are a part of a suite of “Servarr” services (there’s also Readarr for books and Whisparr for adult content) that allow you to automate the collection of new episodes of TV shows (Sonarr), new movie releases (Radarr) and music releases (Lidarr). They hook in to your BitTorrent client or Usenet binary newsgroup downloader and crawl your preferred Torrent trackers and Usenet indexers, alerting you to new releases and automatically grabbing them. You can also use these services to manually search for new media, and even replace/upgrade your existing media with better quality uploads. They’re really a little tricky to set up on a bare metal Ubuntu install (ideally you should be running them in Docker Containers), and I won’t be providing a step by step on installing and running them, I’m simply making you aware of their existence.
The other bit of kit I want to make you aware of is Overseerr, a program that scans your Plex media library and serves up recommendations based on what you like. It also allows you and your users to request specific media, and it can even be integrated with Sonarr/Radarr/Lidarr so that fulfilling those requests is fully automated.
And you're done. It really wasn't all that hard. Enjoy your media. Enjoy the control you have over that media. And be safe in the knowledge that no hedge fund CEO motherfucker who hates the movies but is somehow in control of a major studio will be able to disappear anything in your library as a tax write-off.
1K notes
yuurei20 · 3 months ago
Text
Updated Ortho Facts Part 19: Ortho's Abilities (pt5)
Ortho accesses Idia's tablet remotely in an attempt to escape from Malleus' domain, but there are limitations: he can only connect for a few hours a day given the low number of usable communications satellites, and he can move at barely a tenth of the speed he gets on 4G networks. 
The transfer is also extremely risky, with the threat of Ortho becoming stranded in space between the island and the satellite, or between the satellite and STYX.
As the only one capable of escaping Malleus’ barrier, Ortho decides to take the risk, fulfilling a dream that Idia had for him back when he was first designed (and Ortho is certain to point this out later on).
He succeeds and describes this as his equivalent of unique magic, saying, “I'm not a mage, so I can't acquire a signature spell or anything like that…but using a satellite, I can move my mind and soul over radio waves to another gear.”
Malleus describes this as the ability to make his will depart his body like a ghost and enter other vessels.
When Ortho infiltrates Idia’s dream in his Cerberus gear and Malleus asks how he managed it Ortho responds, “Does a ghost need to explain how they're able to shift between planes and dimensions?”
This is an interesting parallel for Ortho himself to make, as he has said that he is “not a fan of ghosts in general” and it seems they have mocked him in the past, possibly for being a body without a soul as compared to their souls without bodies.
38 notes
blazingpotatoes · 3 months ago
Text
Beneath the Dying Sky
Characters - Quaritch (human) x OC (human) - Quaritch (Na’vi) x OC - OC x Na’vi & Human Characters
Summary - On the moon Pandora, xenobotanist Dr. Daphne Andersen joins the Avatar Program to help save the Na’vi and their fragile ecosystem from the RDA’s exploitation, while trying to unlock the secrets of Pandora’s unique plant life. Crossing paths with Colonel Miles Quaritch, ruthless leader of the enemy forces, an unexpected connection grows between them, as Quaritch begins to question his mission—and both must decide what they’re willing to risk for the planet and for each other.
Word Count - TBD
Warnings - NSFW (Explicit smut), Enemies to lovers, dubcon, dirty talk, age gap, MDNI…etc.
A/N - I’m not super well versed in Avatar Lore so there might be some things misnamed, some things made up, but I try to look up most things. Trying to make this a slow burn but I love writing their interactions so forgive me. Posting in the hopes it will keep me motivated to write and get better at writing.
Chapter One
Gathering her things from the overhead storage, she stood to depart the ship, having just arrived on Pandora. When she accepted the position with Augustine, she had glossed over the fact that it would take nearly six years to reach the faraway moon, twelve years total spent in travel alone, with no real timeline on how long she would even be staying on planet. It could be years.
Slipping the exopack onto her face, she secured it with a click, flooding her lungs with rich oxygen not naturally accessible in the planet’s environment. Her arrival would shake things up with the RDA’s head honchos, the same ones who thought they were getting another ‘good ol boy’ researcher, a would-be yes-man puppeting anything the administrators wanted said in order to steamroll the local flora and fauna. Grace had warned her ahead of time exactly who and what she would be dealing with. On the surface she would be just another researcher, studying and cataloging Pandora’s natural resources and plant life. Unobtanium may have been the prize for the RDA, but if she and Grace could tap into the hidden secrets of Pandora’s biology, it might just be the key to saving Earth’s diminishing resources.
The same thing that was killing Earth, humanity’s innate capacity for greed, would soon take over Pandora, stripping it of everything remotely useful, for the right price. That was the exact thing she was here to prevent, if she played her cards right. The RDA had quashed any attempts at synthesizing unobtanium on Earth, seeing the massive profit margins as more lucrative offworld, so those few scientists pioneering the research conveniently “disappeared” off the face of the earth along with their work. Grace’s research could potentially make seeking unobtanium a fruitless endeavor, but its development was a direct challenge to RDA guidelines, which meant having to conduct their work in secret.
Stepping outside the ship, she made her way past the many machines and soldiers meandering about before managing to make it into the intake hall of Hell’s Gate, narrowly avoiding the massive military equipment being unloaded. Grace would likely be pacing around waiting for her, cigarette dangling from her lips, giving quick glances to her watch and questioning where the hell she was. The place was confusing as hell, with no clearly written directions or signs telling anyone where they were or where to go, unless you were a soldier, and then you had a grizzled old bastard in your face telling you where to go and what to do. Up ahead, a rowdy bunch of soldiers headed directly at her, taking up the entire walkway, leading her to duck through the next door that appeared in order to escape, and directly into a briefing for the new soldiers who had just arrived, same as she had. Disengaging the now useless exopack, she paused for a moment. It wouldn’t hurt Grace to wait a little longer. If she was lucky, one of the handsome new recruits could lead the way to the communications center.
“You are on Pandora, ladies and gentlemen. Respect that fact every second of every day. As head of security, it’s my job to keep you alive. I will not succeed…not with all of you. If you wish to survive, you need a strong mental attitude, you need to follow procedure.”
With three deep claw marks etched into his head, it was clear who he was. Quaritch was the first big bad Grace had briefed her about. Ranger Rick, she had un-affectionately dubbed him. A tough-as-nails son of a bitch who was the head of security here; he didn’t call the shots per se, but he could be, and likely would be, one roadblock they would consistently be going up against. He had also adamantly campaigned against Grace bringing any more crew to Hell’s Gate. She’d done her research on the man thoroughly, studying every image and snippet of information she had back on Earth, taking care to linger over those bulging biceps and the ripped physique barely fitting underneath the taut tanks he wore, and now, by sheer luck, he was right there in the flesh.
‘Last thing we need is more of you limp dick scientists crying over plants and taking up space’
Watching him move, the way he spoke, he was captivating, and it was easy to see why he was considered a natural-born leader. The people present hung on his every word, exactly what the RDA needed to keep their expendables willing to fight while they reaped the reward. As if we weren’t the aliens here. Entirely lost in her own head, she didn’t realize he’d reached the last few points of the presentation until, in the next breath, the Colonel dismissed the recruits, the young men and women filing past her, a few giving her the once-over as they did. Grabbing her bag at her side, she moved to catch up and pick out an escort when a hand gripped her arm.
Turning had her face to face with the devil himself, another younger marine flanking his side. She had to admit Colonel Miles Quaritch was even more handsome in person: blue eyes you could get lost in, a devilish smile, and the scars definitely added to the allure. For a moment she found herself just gazing; pictures didn’t do him justice.
“You lost Mrs….”
“Ms. Daphne Andersen, and you must be Colonel Quaritch”
“I guess my reputation precedes me”
“If your reputation is being a ball-busting prick, then yes,” she said, holding out a hand. “I’m one of the new limp dick scientists here to cry over plants and take up space”
The Colonel was momentarily taken aback at her brazen behavior and deadpan delivery, a slight smile pricking the corner of his mouth as he reached out to firmly shake her hand. Not many people had the balls to speak to him that way and stay standing.
“Well well, Corporal Wainfleet, we got us a dirty-mouthed little girl on board. Not very ladylike of you, Ms. Andersen.” Pulling her close by her hand still in his, coming within a hair’s breadth of her face, that playful scowl turned a little more serious, his voice dropping to just above a whisper, taking care not to draw too much attention.
“You best watch your tone when speaking to me little lady. Your daddy may have not spanked that pretty little ass, but I just might have to put you over my knee in front of all these men, pull down those panties and give you a few good smacks to correct that attitude. I’m sure Corporal Wainfleet here would love to see that”
“Oh yes sir” Wainfleet agreed, eyeing her up and down. Finally dropping her hand, he stepped back.
Daphne only lightly laughed under her breath at the now slightly agitated man, as she adjusted her glasses. Most hardasses she encountered generally waited a little longer to make a power play and were never ready when she decided to make the first move. Quaritch was definitely proving himself every bit the dick head Grace claimed him to be. But that little bit about the spanking, well that had her admittedly a little weak in the knees.
“You know… I think you and me will be fast friends, Colonel.” Grabbing her bag, she slung it over her shoulder, tucking the exopack under her arm before stepping forward to bridge the gap, whispering back in a hushed tone but loud enough for the Corporal to hear. “..and I may just have to imagine that little spanking scenario later on tonight when I’m in bed alone”
Following it up with a quick wink, she turned and headed back towards the corridor, not even staying to glimpse his reaction, choosing to have the last word in that small first battle, just one of what she was sure would be many. Quaritch was used to being top dog, swinging his dick around, thinking big muscles and a vicious tongue were all that was needed. She would gladly show him otherwise, with a smile on her face, and if he was already so bold as to threaten to spank her at their first meeting, well, she wondered exactly what else their little wordplay would reveal.
Quaritch could barely contain his ire at the retreating figure. It had been a hot minute since he’d been dressed down, much less by a pipsqueak of a woman. He had scolded her to keep up appearances, knowing his Corporal liked to gossip among his men, but deep inside he found… he kinda liked it. There weren’t many women here who caught his fancy; most were too scared of him to even meet his eye, another thing he noted about her. She had kept eye contact with him the entire time, never wavering. Even having the nerve to laugh at his “threat”. Little did she know he was more than serious about punishing her roughly, the thought making him slightly hard. He’d spotted the ID card hanging from the lanyard around her neck; he might just have to pop in for a little surprise reunion sometime later, see if she followed through with that little quip about him spanking her.
Maybe having the witty little minx around wouldn’t be so bad he thought to himself, if she was so brazenly open then maybe it wouldn’t be too much to picture the hellcat willingly in his bed. He’d caught the way she took a little longer look than normal when he first spoke to her. They were all the same, a few sweet words and it wouldn’t be long before their clothes were on the floor.
“Lyle, I think we just found a new toy”
24 notes
jades-typurriter · 6 months ago
Text
Secure Connection
As promised: more Posie!! I wrote this one toward the end of last Spring after a couple of conversations with friends regarding the malleability of digital bodies (as well as still having Many Thoughts about the way code can give them new compulsions, after writing something about Annie and a new taur-shaped chassis for a friend's Patreon). Enjoy reading about her dealing with a corporate-mandated "hardware" update!
CW: Genital TF, this is another one that's As About Sex as it can possibly be without being about sex
Posie sat, sulking—steaming, even—in her office. It was a small side room off of the main floor of IT personnel, system engineers, and other technical employees of her corporation. Much like a central server, it was placed for easy access to the department-wide administrative assistant, and much like a server room, it was snug, windowless, and awash with the calming drone and relaxing warmth of an array of exhaust fans. Though she was free to project herself nearly anywhere on the company’s campus, this was where her consciousness was housed, and where she felt most at home. It was also the only place she could get any damn privacy, a luxury that she was deeply grateful for at present.
A newly-downloaded file weighed on the back of the Renamon’s mind. More literally, it was somewhere in the racks of drives that made up her long-term memory, to and from which mission-critical information was transferred in the course of doing business. Had somebody asked where exactly the file was stored, she would have been able to list the specific drive and the exact directory address, but she had de-prioritized the allocation of her processing resources for the download. Once again, she had received an assignment from her superiors, and once again, she was hesitant. She may even have admitted to being recalcitrant. She resented the orders.
The package of data in question was an update for her own software, a suite of new tools to allow management to offload yet more menial tasks onto her in the name of “efficiency”. Forget that she could diagnose a software issue faster than any of the engineers could even open a remote connection to the malfunctioning device. Instead of allowing her to take the reins, they saw fit to divert more of her attention to the least impressive among talents, and the one she already put to use the most often: transferring data.
This wouldn’t have been much of a problem, ordinarily. After all, Posie resided in the beating heart of the network, the nexus through which the vast majority of information was sent and received. It could be… meditative. Parsing streams of ones and zeroes, overseeing the flow of packets, redirecting traffic to equally spread the load across modems and routers so as to optimize travel time. It could even have been considered relaxing, if a worker of her caliber needed to relax. Instead of offering her a vacation (pah!), however, the update felt more like it heralded a demotion, denying her even the ability to pluck like harpstrings the miles of copper and gold that lined her facility. She was expected to deliver this data on foot.
Management justified this humiliation with practical concerns: some information, much like the old records she was often tasked to dispose of, was so confidential that it could not be sent via wireless transmission. Even hardwired connections were too fallible for the likes of next-generation schematics and financial access keys—a single compromised workstation, or compromised worker, could spell the loss of the company’s upper hand in its market. She wasn’t even going to be afforded the dignity of carrying an external hard drive to the destination. That would require the slow and tedious process of physically moving from one place to the next; this was one of the only times that she regretted the freedom of movement that was so coveted by her flesh-and-blood peers.
With no room to make exceptions for security protocol, she gripped the edge of her desk, brow furrowing, eyes squinted shut in consternation. Eventually, she huffed, rose, and turned her attention to her “physical body”, summoning up the file in much the same way that one would approach a plate of food with a pungent odor. The Renamon steeled herself and began to more closely examine its contents. She read the raw code similarly to how one might read words on a page; however, where the turning gears of the organic mind would, almost unconsciously, conjure up an image as a result of those words, her mind kicked off a series of involuntary, autonomic processes.
Her body carried out the instructions on her behalf. Once she started, she had no control until she finally reached a stopcode; it was the nature of being a program herself that code had as much of an influence on her mind and body as her own thoughts, her own will. In opening the package, she reluctantly consented to the changes that management saw fit to make to her. It was better than the eventual forced-deadline sort of update that software companies were so keen on using nowadays, and at least choosing the time and place allowed her to make herself presentable again before having to face another person.
Having parts of her code—her very body—rewritten by the update was a strange sensation, not unlike having your thoughts dictated to you by an outside force. Stranger still was that she could feel the exact delineation between her previous self and the patches of… well, the patch. She could feel it quite strongly, as a matter of fact: beneath her skirt of simulated sky-blue fur, between her legs, she could feel her mesh being edited. Stretched. Reshaped. The vectors that made up the triangles of her wireframe soul were being rewritten, mathematically transformed. A shape began to protrude from the once-flat span at the bottom of her torso, at first round and indistinct, but quickly increasing in resolution.
The Renamon struggled to process the sensations as a long, slender connector began to take shape. This often happened with changes to her body plan; inputs streamed into her mind from directions, locations, that previously never sent any signals, and the new additions seldom had their sensitivity adjusted downward for her convenience. In this case, it was highly sensitive, delivering reams of data to the base of her skull just from brushing up against her own fur, or the gentle flow of air from the computers in her office. It made sense, given that it was supposed to be a high-capacity transfer tool, but she was too busy buckling at the knees and clutching at the desk behind her so she didn’t fall flat on her rear for the thought to occur to her.
Her processors demanded more cooling, kicking into high gear as they formatted the two new storage devices that accompanied the connector, tailor-made for packing confidential data as tightly as possible. The sound of whirring fans filled the room, stirring her fur and sending shivers up and down her back; she could only hope that the rushing exhaust made enough noise to drown her out, whimpering despite herself. The new drives were larger (and more unwieldy) than the ones that were built into her chest, much to her chagrin. She was forced to adjust her stance and her gait as she found her footing again, spreading her legs wider than she was accustomed in order to give them enough room.
The spinning in her head slowly settling down, she slowly began to compose herself once again, taking stock of the new additions. They were cumbersome, to be sure, and she lamented how they jutted out from her otherwise sleek form and burdened her with less-graceful posture. It didn’t even match her fur! The software engineers that had concocted the code had at least included one small mercy: a compartment for the connector to retract into, nestled in the fur above the storage drives. No such luck for the drives themselves. She supposed she would just have to adjust to walking with delicate hardware in tow. As she went to smooth her fur over her lap again, her paw recoiled away. Some kind of… static discharge was left in the fluff. A memory leak, perhaps? The fact that such a malfunction could be caused just from having the connector brush up against her fur appalled her, deepening her frustration even more. They couldn’t even test the update for bugs before shipping it out to her. She shook out her paw and finished arranging her skirt as best she could before working up the composure to finally leave her office.
Picking up the payload for which all this fanfare had been arranged was at least a quick, easy process. She stopped into the office of the manager that had assigned her the task; she offered a businesslike nod and, knowing that she was always itching to skip niceties in the name of saving time, he offered a straightforward wave at his personal terminal. She held a paw over the computer tower and, in the time it took for electricity to arc to her fingertip with a tinny zzzrt, she had already searched his directory for the relevant test files and copied them to the newly-installed drives. Wireless transfer, yes, but only technically. The engineers had specifically asked a member of another division, whose computer network wasn’t connected to their own; it was as though she had picked a folder up from his desk and walked out with it.
Moving the file was just as uneventful. It was far from the first time that she’d navigated the sprawling corporate property, and even if it were, the maps existed just outside the orbit of her thoughts, ready to be summoned to mind at a simple impulse. What she was not expecting, however, was the technician who was waiting in the server room to which she was asked to deliver the file. While she preferred to work in the isolation of rooms that were set aside specifically for hardware, she was far from unused to being in the presence of the other people responsible for maintaining the company’s systems. That said…
“Can I help you?” The Renamon icily asked.
“Oh, I don’t need anything! I’m just here to take notes on the transfer.” Her tone was cheery; evidently, she wasn’t aware how compromising the new additions were. “The time it takes, any obvious issues. I’ll be the one checking the files against the originals, too,” she concluded, hooking a thumb over her shoulder at a monitor behind her.
“I see,” Posie replied through gritted teeth. “You have clearance to see these files, then?”
“Well, they’re just dummy data, ma’am.” At least she was respectful.
“And the proprietary hardware I’ve been… equipped with?” she forced out, keeping her synthesized voice even.
“Oh, for sure I do. I designed it!”
Oh! she seethed. So she knows pre-cise-ly the position he’s put me in.
“Well. I suppose there’s no point in delaying things, then.”
“Ready when you are!”
With tense shoulders, she turned toward the server rack, eyes darting over it, searching for where exactly she was supposed to connect to the array. After glancing over the contents of each drive, she found the one she was supposed to copy the data into—deposit would be more apt, as it was her understanding that the files would be automatically flushed from her system—and found a port that would allow her to access it. Conveniently, it was around waist height. She wondered, crossly, whether that had been an intentional design decision by this engineer as well. As she looked at it, she felt a twinge from the connector; on its own, like a Bluetooth device automatically searching for signals, it slid itself out from its fuzzy little compartment.
Her skin was abuzz, and her fur stood on end. She couldn’t quite tell if it was coming from the connector itself, or if it was the feeling of the programmer’s eyes on her. If she could have taken a deep breath, she would have then. Without any way to stall further, or to tell the leering young woman to take her test files and store them somewhere indecent, she simply pushed forward with dropping off the damned data.
The instant the connector grazed the metal of the port, lightning shot into it, through her body, and into her head, making it swim with electrical potential. A stuttering, lagging thought made its way to the surface of her mind: they really had overtuned the sensitivity. She stifled a gasp and suppressed the urge to lay into the engineer (electrons were eager to flow out of her even without proper alignment with the contacts in the port, and didn’t she know that discharge like that could damage a piece of hardware?!), willing her body to keep pressing the stupid connector into the socket.
Even as she tried to get it over with already, something in the back of her mind compelled her to draw back a bit. If she had been restraining herself from reprimanding the engineer for risking the hardware, then she should at least do it the service of ensuring she was properly aligned, shouldn’t she? She obliged the impulse, and the motion all at once became much jerkier, less controlled. The friction of the port against her connector was enough to send her tail snapping back and forth, and she could tell that the temperature in her own server’s room had risen by a fair few degrees. Back and forth, wiggling side to side, she continued to readjust and realign herself, driven by unfamiliar code and overwhelmed by the signals pouring into her. She lost herself in the task, forgetting herself, forgetting her surroundings, until finally the technician cleared her throat.
“Ma’am,” she ventured, blushing and wide-eyed. “What, um. What are you doing? You should just need to plug it in.”
“I’m.” Her interruption had snapped the Renamon back to reality. She was mortified, tail sticking straight out and back ramrod straight. Her cheeks burned mercilessly. “I’m calibrating the connection.”
“Calibrating?”
“Did you want your files transferred with or without corrupted and incomplete data?” She snapped, hoping that her authoritative tone would head off any debate. “Assign me experimental hardware and then ask me to be reckless with it, hm? Should I be taking notes to give to our superiors?”
“I—alright, I guess you can’t be too careful,” she stammered, sheepishly pressing her legs together. “That was even something I tried to work into the design, so, c-carry on?”
“Thank you,” Posie blustered, turning back to the server rack. She did so slowly, reluctantly relishing the feeling of sliding around within the socket. She allowed herself one or two more “practice” attempts, hoping that it wouldn’t arouse too much suspicion from the engineer. Ultimately, just like before, there was no use in continuing to stall, and when she was able to bring her body to a stop, the rational part of herself was eager to be done with this entire torrid affair.
With more force, she pressed the connector inward one final time, trembling as the latch began to press against the opening. Slowly, agonizingly slowly, she continued, overwhelmed by the volume of electricity surging into her. The latch gave, compressing as it continued to slide inside, until finally it clicked into place, securing her to the array of drives and finalizing the connection.
All at once, a torrent of data poured out of her, an electron tsunami that felt like it threatened to spill out of the socket in which she was hilted. More data was transferred in the span of a few seconds than she was used to consciously processing, having cultivated such skill in delegating and compartmentalizing with background processes. Once again, the world around her was utterly drowned out; the strength fled her legs, and she clung to the steel bar that reinforced the top of the server rack, threatening to topple the entire system. Her self-control abandoned her as well and, forgetting the engineer, she cried out with an airy, wild, distinctly foxlike yelp. She screamed in surprise, gasped at the deluge of information, moaned because there was no room left in her mind for thought to do anything else.
Quickly, the disks of the server rack had finished writing the files she had carried to them, and her own drives were thoroughly purged. In another building, the radiators serving her processors shed heat at their absolute limits, and fans worked overtime to bring her back within her safe operational range. As her overworked circuitry began to chug through the backlog of sensory information, the entire experience caught up with her—including the detail that this entire shameless display had been carried out in front of that underhanded little engineer. She blinked, hard, and whipped her head to face her. For as hot as her own ears felt, the young woman’s face appeared to be glowing even brighter.
“What. Was that.”
“Um—”
“I’m used to new adjustments requiring desensitization, or even adjustment on their gain,” she growled, voice low and eerily even. “But that was a bridge too far to just have been miscalibration. Why did you design it like that?”
“Well, y-you remember how I mentioned, um, having considered an early disconnection?” Posie’s frosty glare didn’t waver, so the tech continued, answering her own rhetorical question. “That was, uh, the safeguard. Against early disconnection. I, figured it’d just be easier to make it so you wouldn’t want to unplug—”
“Do you think you have the au-thor-ity to go making changes to my mind, young lady?!”
“I-I can roll back the update if you want—”
“I think you’ve done QUITE enough!” The Renamon declared, despite herself. Perhaps it was genuine distrust, or perhaps—perhaps she truly couldn’t tell which desires were her own, at the moment. This would require careful study of her own system files.
Another small click broke the silence following her outburst, and the dongle began to retract from the server’s port and back into Posie’s body. Now free to move around, she dusted and fluffed her skirt and leaned down to look the engineer in the eye.
“I trust that you can report to your supervisor that I performed to your expectations,” she hissed. “And that there will be no need for any further discussion of your little project.” The programmer nodded, eyes even wider than before—and cheeks even redder? The Renamon scoffed, sneered, and spun, storming out the door, already allotting time in her schedule for the next time that she would be called upon for such a delivery.
Utterly unsurprisingly, she had been correct in her assessment that her superiors would take every opportunity to save their organic employees’ time at her expense. Confidential deliveries became a regular part of her routine, and though she had great disdain for being reduced to a mere courier for so much of the workday, she insisted upon completing the task to her usual, lofty standards.
Posie was as prompt as she always was, dropping everything to ferry information between privileged parties, striving to reduce latency even in more analogue forms of communication. There was the occasional complaint about how long downloads took once she had finally arrived at her location, but she was quick to remind such impatient recipients that the decision to follow this protocol came from on-high, and that even for someone who worked as quickly as her, great care for the safety of the data was a corner that simply could not be cut in the name of rushing around.
She was as meticulous about ensuring proper alignment with the port, fine-tuning her contact with the wires within, as the first time she had experimented with the new tools, and complaints about noise from the server room were easily dismissed as the usual stress of supporting her formidable computational power. After all, she was often venturing out of the range of her home network, hosting herself entirely on the recipients’ systems; was she at fault when they couldn’t handle the information throughput they asked of her?
Once the deliveries had become more routine, and none of her peers bothered to check in when they felt it was taking too long or getting too noisy, she began to find enjoyment in the solitude of her work, just as with the other, admittedly more tedious, tasks she was expected to carry out. With fewer prying eyes to judge her performance, she could make herself more comfortable while handling transfers. She didn’t have to worry that anybody would walk in on her in the debased state she often found herself in while connected directly to a data center, leaning her full weight on the poor rack, tongue lolling out and chest heaving air to keep her cool. 
Then again, if somebody—especially that little technician who’d saddled her with these “upgrades”—wanted to question her efficacy, that was more than fine by her. Posie was a woman who prided herself in her work, and would seldom turn down a chance to demonstrate her first-rate hardware and unparalleled optimization. She would be more than happy to demonstrate just how quickly she could pump out information, and just how much throughput she was capable of.
Thank you for reading! If you want to see more of my work, you can check it out here and here!
23 notes
richardmhicks · 1 year ago
Text
Always On VPN Security Updates June 2024
The Microsoft security updates for June 2024 have now been published. Reviewing the list of bulletins shows three security updates of importance to Always On VPN administrators. Two affect the Windows Server Routing and Remote Access (RRAS) service, and one affects the Remote Access Connection Manager (RasMan) service. None of the updates are critical this month, which is good news. RRAS The…
0 notes
howlsofbloodhounds · 1 year ago
Text
So this is part 2 of this post, if yall wanna give it a read for context.
In this post, I’ll be talking about how Color’s physical disability of having only one eye would influence how he interacts with his special interests in photography and travel.
As well as how his PTSD, autism, chronic fatigue, and separation anxiety from Killer could also affect things.
With one eye, Color might have reduced depth perception, which could make it challenging to gauge distances accurately. He might rely more on autofocus features, practice to enhance his spatial awareness, or use techniques like focus stacking for precise shots.
He might prefer using cameras with electronic viewfinders (EVF) or live view screens rather than optical viewfinders, which could be more challenging to use with one eye. Adjusting camera settings and composing shots via a larger display would be easier.
He might develop unique framing and composition techniques, leveraging his perspective creatively. Color could take extra time to ensure his shots are well-composed, possibly using grid overlays or other aids to help with alignment.
Customizing camera gear to suit his needs, such as using tripods, stabilizers, or remote controls, to help steady the camera and compose shots more comfortably.
He might spend additional time in post-processing to correct any minor misalignments or issues that arise from the reduced depth perception during the shooting process.
For travel, navigating unfamiliar places might require more caution, especially in crowded or complex environments. He might use mobility aids, rely on GPS and mapping apps, or travel with companions to ensure safety.
Color could engage in meticulous planning to minimize unexpected challenges, such as researching accessible routes, accommodations, and transportation options.
Color might use his experiences and perspective to connect with others, sharing how his disability influences his travel and photography, fostering understanding and empathy.
Developing strategies to cope with the physical demands of travel, such as pacing himself, taking regular breaks, and prioritizing destinations or activities that are less physically demanding.
His unique perspective could inspire him to create compelling stories or advocacy pieces about accessibility in travel and photography, raising awareness and inspiring others with disabilities.
Embracing his distinct view of the world, his photography could offer unique perspectives that stand out, turning his perceived limitation into an artistic advantage.
He might become involved in communities focused on accessible travel and photography, sharing tips, experiences, and inspiring others with similar challenges.
Autism and chronic fatigue would likely significantly impact Color’s ability to engage with his special interests in photography and travel.
In photography, chronic fatigue would necessitate careful energy management. Color might plan shorter, more focused photography sessions and prioritize rest to avoid burnout.
Streamlining his workflow, from setting up equipment to post-processing, to conserve energy. This could include using presets in editing software or organizing his gear for easy access.
He could choose photography locations that are easily accessible and require minimal physical exertion. He might also prefer locations close to home or base to reduce travel time and energy expenditure.
He would likely use lightweight equipment to reduce physical strain, possibly investing in high-quality but compact cameras and lenses. He might also use monopods or lightweight tripods for additional support.
Autism can come with sensory sensitivities. Color might choose quieter, less crowded locations for photography and use noise-canceling headphones or other tools to manage sensory overload.
With travel, he’d have to pace himself. Planning travel with built-in downtime to rest and recharge. He might avoid overly ambitious itineraries and allow for flexible scheduling to accommodate his energy levels.
He’d probably choose accommodations that are comfortable, quiet, and accessible, ensuring he has a safe space to retreat to when needed.
He’d prefer modes of transportation that offer comfort and minimal stress, such as direct flights, train travel, or driving. He might also opt for private or semi-private tours to control the pace and environment.
Keeping up with healthcare needs, including regular check-ups, medication management, and any necessary accommodations. He might also carry a travel health kit tailored to his specific needs.
He’d combine photography with travel in a way that maximizes enjoyment and minimizes strain. For example, he might focus on travel photography during the golden hours (early morning and late afternoon) when conditions are optimal, and the rest of the day can be used for rest.
Creating content that reflects his experiences with autism and chronic fatigue, such as blogs, vlogs, or social media posts. This can help raise awareness and provide valuable insights to others with similar challenges.
Engaging with communities of autistic travelers and photographers to share experiences, tips, and support. This can provide a sense of camaraderie and practical advice tailored to his needs.
Establishing routines that provide predictability and reduce stress. This might include having a consistent photography and travel routine, preparing for trips well in advance, and creating checklists.
Practicing mindfulness or relaxation techniques to manage stress and sensory overload. This can help maintain focus and calm, particularly in challenging environments.
Utilizing assistive technologies, such as apps for energy tracking, sensory-friendly gear, or digital tools that aid in planning and organization.
Color’s PTSD from solitary confinement and isolation in the Void, combined with his separation anxiety towards Killer, can create a complex situation that both challenges and shapes his engagement in traveling and photography.
Color’s need to stay on the move due to PTSD makes traveling appealing, as it provides a sense of freedom and escape from confinement. However, this constant movement could also become exhausting and anxiety-inducing if it lacks purpose or stability.
His separation anxiety towards Killer might lead him to seek Killer’s company while traveling. Traveling with Killer could provide a sense of security and reduce his anxiety, but it also means his travel plans would need to align with Killer’s availability and willingness to join him.
Color might need to carefully plan his travels to ensure he has safe and familiar places to stay, reducing the unpredictability that could trigger his PTSD. Having a structured itinerary could help him feel more in control and less anxious.
Traveling to new and unfamiliar places might sometimes trigger memories of his isolation, especially if he encounters situations that remind him of the Void. He would need to find a balance between exploring new places and ensuring his mental well-being.
Photography could serve as a therapeutic outlet, allowing Color to process and express his emotions through capturing images. It might help him make sense of his experiences and provide a way to externalize his trauma.
Color might be drawn to photographing subjects that reflect his internal state or provide a sense of solace. He could focus on themes like freedom, movement, and connection, finding meaning and healing in his work.
Having Killer around while engaging in photography could provide comfort and reduce his anxiety. Killer might even become a frequent subject in Color’s photos, symbolizing their bond and mutual support.
Color might need to develop strategies to manage his anxiety while photographing, such as taking breaks, grounding exercises, or having a trusted companion like Killer present. This would help him stay focused and engaged in his special interest.
The mutual separation anxiety between Color and Killer could strengthen their bond, as they rely on each other for emotional support. This bond could provide Color with the stability he needs to engage in his interests.
Color would need to balance his need for movement and exploration with Killer’s needs and limitations. They might develop a mutual understanding and compromise, ensuring both their well-being while pursuing their interests.
Color might prefer traveling to places where he can easily find comfort and familiarity, such as visiting friends or known locations. This reduces the stress of the unknown and helps him stay grounded.
Establishing routines or rituals while traveling and photographing can provide a sense of stability. For example, always starting the day with a specific activity or having regular check-ins with Killer can help Color manage his anxiety.
If Killer ever can’t join Color on his travels, they might have frequent phone calls at particular times of the day.
I can see Color sticking to this routine at the exact time and getting anxious and worried if Killer doesn’t call or pick up, which is likely to happen at some point simply because he has memory issues and sticking to routine is hard for him. But Color, at least for a bit, is likely to assume the worst.
Color might also keep a photograph of him and all his friends close by on his person. (I also like to think that Delta made his camera, he keeps some of Beta’s drawings with him, and also he’s memorized the recipe for Epic’s chocolate cookies.)
If he and Killer have already had their wedding by this point, he’d likely keep his ring close and near. Perhaps kissing it before bed, and fidgeting with it becomes a new comforting stim.
Over time, engaging in his special interests despite his PTSD and anxiety can help Color build resilience. Each successful trip or photography session can boost his confidence and reinforce his ability to cope with challenges.
Color might find deeper meaning in his travels and photography by using them as tools for healing and connection. Documenting his journey and sharing it with others can create a sense of purpose and community.
39 notes
systastic · 11 months ago
Note
If it's okay, basement/sleepover themed front room?
sure is! this is a good way to get us back in the game pfft so don’t even sweat it :] -🌳
Basement / Sleepover Themed Front Room
Day Mode
During the day, this fronting room is a warm and shockingly spacious location despite how cozy everything feels. Pillows, blankets, and squishy stuffed animals litter the floor just about everywhere you step. Some pillows and blankets have been stolen away to a corner to make a comfortable nook for the less-social alters. Blankets serve as the roof and ceiling of this place, illuminated by strings of fairy lights and small glowing motes of sunshine streaming in from somewhere beyond the front room. If you put your head against the walls and listen close, you can sometimes make out idle conversation of memories that linger outside the bounds of headspace.
sources :: here, here, and here!
A large projector screen propped up in the front part of this room serves as a window to the outside world. This projector plays what is currently going on as well as providing subtitles for alters who are hard of hearing. Switching the subtitles on is surprisingly easy. The in-system subtitles can be finicky, lag a few seconds behind speech, and can sometimes get things wrong. Even still, alters frequently comment on how they give a nostalgic feeling similar to captions on older televisions.
Bodily actions and movements are controlled through use of the laptop computer connected to the projector. This laptop has a special switch on the side, making it able to change forms if needed. It is most often turned into a TV remote, but other forms may also be used.
Exiting the front room is as easy as walking out of the fort’s blanket flaps. Simple and easy!
source: here!
Night Mode
You may be thinking “ah, this sort of fronting room can’t possibly be comfortable at night with all of these lights!” Not to worry! This fronting room has the ability to change its appearance based on the time of day. While it is well-lit and comfortable during the day, it becomes much calmer at night. Fairy lights and sunbeams are replaced with soothing swirling galaxy lights that cover the walls and roof in mesmerizing patterns. Blankets and pillows are automatically placed into pairs to create beds for those who linger near front or fall asleep mid-fronting.
The large projector screen is traded out for a smaller television that can be adjusted brighter or dimmer as needed. All other functions of the projector remain the same: subtitles, the ability to switch forms, and the familiarity of a laptop resting on your knees. One thing that does change is the room’s appearance. Rather than only being a blanket fort, there is also a small wooded area outside of the fort styled like a larger basement. It contains a comfortable couch to sit upon, an armchair, a side table lamp, and an ottoman on which to kick up your feet. This is perfect for alters who prefer to work on their own for system management during the night. Some folks claim there is an adjacent room for late night snacking but not everyone seems to be able to access this. Then again, it may be just a dream…
source: here!
source: here!
53 notes
mariacallous · 3 months ago
Text
As the Trump administration's Department of Government Efficiency (DOGE) continues to rampage through the United States federal government, essentially guided by Elon Musk, the group has also been upending traditional IT boundaries—evaluating digital systems and allegedly accessing personally identifiable information as well as data that has typically been off-limits to those without specific training. Last week, The New York Times reported that the White House is adding Musk-owned SpaceX’s Starlink Wi-Fi “to improve Wi-Fi connectivity on the complex,” according to a statement from White House press secretary Karoline Leavitt. The White House's Starlink internet service is reportedly being donated by the company.
Spotty internet is an annoying but highly solvable problem that WIRED has reported on extensively. Of course, the White House is a highly complex organization operating out of a historic building, but network security researchers, government contractors, and former intelligence analysts with years of experience in US federal government security all tell WIRED that adding Starlink Wi-Fi in a seemingly rushed and haphazard way is an inefficient and counterproductive approach to solving connectivity issues. And they emphasized that it could set problematic precedents across the US government: that new pieces of technology can simply be layered into an environment at will without adequate oversight and monitoring.
“This is shadow IT, creating a network to bypass existing controls,” alleges Nicholas Weaver, a member of the nonprofit International Computer Science Institute's network security team and a computer science lecturer at UC Davis. He adds that while secret and top secret information is typically (but not always) processed only on special, separate federal networks that have no wireless access, the security and uniformity of White House Wi-Fi is still extremely important to national security. “A network like the White House unclassified side is still going to be very sensitive,” he says.
“Just like the Biden Administration did on numerous occasions, the White House is working to improve WiFi connectivity on the complex,” White House spokesperson Karoline Leavitt tells WIRED in a statement.
A White House source who asked not to be named supported the switch, arguing that in some areas of the campus, “the old Wi-Fi was trash.”
Researchers point out that while Starlink is a robust commercial ISP like any other, it is not clear that it is being implemented in compliance with White House Communication Agency requirements. If the controls on the White House Starlink Wi-Fi are more lax than on other White House Wi-Fi, it could introduce security exposures and blind spots in network monitoring for anomalous activity.
“The only reason they'd need Starlink would be to bypass existing security controls that are in place from WHCA,” claims former NSA hacker Jake Williams. “The biggest issues would be: First, if they don't have full monitoring of the Starlink connection. And second, if it allows remote management tools, so they could get remote access back into the White House networks. Obviously anyone could abuse that access.”
One baffling aspect of the arrangement is that Starlink and other satellite internet is designed to be used in places that have little or no access to terrestrial internet service—in other words, places where there are no reliable fiber lines or no wired infrastructure at all. Instead of a traditional ISP modem, Starlink customers get special panels that they install on a roof or other outdoor place to receive connectivity from orbiting satellites. The New York Times reported, though, that the White House Starlink panels are actually installed miles away at a White House data center that is routing the connectivity over existing fiber lines. Multiple sources emphasized to WIRED that this setup is bizarre.
“It is extra stupid to go satellite to fiber to actual site,” ICSI's Weaver says. “Starlink is inferior service anyplace where you have wire-line internet already available and, even in places which don't, inferior if you have reasonable line of sight to a cell tower.”
Weaver and others note that Starlink is a robust product and isn't inherently unreliable just because it is delivered via satellite. But in a location where fiber lines are highly available and, ultimately, the service is being delivered via those lines anyway, the setup is deeply inefficient.
While Starlink as a service is technically reliable, incorporating it in the White House could create a long-term federal dependence on an Elon Musk–controlled service, which could create future instabilities. After European officials raised concerns earlier this month on whether Starlink might stop serving Ukraine, Musk posted on social media: “To be extremely clear, no matter how much I disagree with the Ukraine policy, Starlink will never turn off its terminals … We would never do such a thing or use it as a bargaining chip.”
16 notes
posttexasstressdisorder · 2 months ago
Text
CNN 5/6/2025
Inside the multi-day meltdown at Newark airport
By Pete Muntean, Rene Marsh, Aaron Cooper and Amanda Musa, CNN
Updated: 7:05 PM EDT, Tue May 6, 2025
Source: CNN
Air traffic controllers in Philadelphia were guiding planes to Newark Liberty International Airport in New Jersey last week when communications suddenly crashed.
“Approach, are you there?” one pilot asked the controller.
The controller stopped responding.
United Airlines Flight 1951, flying from New Orleans to the Newark hub, tried to radio the controller five times before finally getting a response.
“United 1951, how do you hear me?” the controller asks, according to air traffic control conversations recorded by the website LiveATC.net.
“I got you loud and clear, United 1951,” the pilot responds.
For at least 90 seconds, controllers lost the ability to see planes on radar scopes, and for a minute they could not communicate with pilots, a source with knowledge of the situation tells CNN. (Transportation Secretary Sean Duffy said Monday air traffic controllers lost contact for 30 seconds.)
The April 28 outage impacted information coming from radars located at a Federal Aviation Administration facility in Westbury, New York, where the air traffic controllers used to manage flights heading to Newark. Control over the airspace was transferred to Philadelphia in July. The radars are now operated using a remote line the source described as “a long extension cord.”
The outage was the result of a failure of that copper wiring that transmits information to Newark approach control, a separate source tells CNN. “There was some infrastructure breakdown related to how the information is relayed right now.”
Similar outages happened twice before, the first source notes.
Those earlier incidents were reported to the FAA safety reporting system and “adjustments were made,” which kept the systems stable until the most recent loss, they say.
The technology interruption ultimately cascaded into a weeklong meltdown at Newark, one of the nation’s largest airports. It resulted in delays and cancellations for thousands of customers, controllers taking leave for trauma, and renewed scrutiny on an outdated air traffic control system.
The chaos also highlighted the challenges of an understaffed system, the latest incident in an already turbulent year for aviation that included a deadly collision between a passenger jet and US army helicopter.
‘I don’t know where you are’
Controllers at Philadelphia Terminal Radar Approach Control, which coordinates planes arriving at Newark, temporarily lost access to the systems that help them guide the aircraft, meaning they were unable to see, hear or talk to the planes, officials said. Controllers lost primary communication, and the backup line did not immediately take over, Transportation Secretary Sean Duffy told Fox News Monday. Audio obtained by CNN reveals the tense moments at the Philadelphia control center.
“United (flight) 674, radar contact lost,” a controller tells a pilot flying to Newark from Charleston, South Carolina. “We lost our radar so just stay on the arrival and maintain 6000 (feet).”
The same flight, traveling at hundreds of miles an hour, returns to the radar but does not show up in an accurate position.
The connectivity between Federal Aviation Administration radar and the frequencies that air traffic controllers use to manage planes at the airport “completely failed,” a source with knowledge of the situation said. Without radar, another approach controller told the pilot of a smaller aircraft to rely on towers for clearance.
“Do I have bravo clearance?” the pilot asks. Bravo clearance is permission to enter into the airspace surrounding a larger airport, like Newark Liberty.
“No, you do not have a bravo clearance. We lost our radar and it’s not working correctly. Radar service terminates… If you want a bravo clearance, you can just call the tower when you get closer,” the controller said.
Colin Scoggins, a former air traffic controller and retired military specialist at the FAA, told CNN that losing both radar and communications on the job can be a scary experience.
“If you cannot talk to a pilot, then you’re really in trouble,” he said. “I would find it very traumatic.”
“You’re sitting there watching the situation unfold, kind of like on 9/11, you see situations unfold that you have no control over. And when you’re a controller, you want to be in control. When you take that away, it can be very traumatic,” Scoggins added.
“Imagine driving down the highway in traffic and someone puts a blindfold over your eyes and tells you to keep driving, and when you come back from driving dark you have to figure out what to do next,” a source told CNN.
About 15 to 20 flights were being controlled by Newark Liberty approach controllers when communication and radar went down on April 28, according to an analysis by flight tracking site Flightradar24.
The number is based on the altitude of aircraft bound for and departing Newark and audio from the approach radio frequency, Ian Petchenik, the director of communications for the site, tells CNN.
No crashes occurred, but at least five FAA employees took 45 days of trauma leave afterward.
Aviation analyst Miles O’Brien told CNN that the controllers did what they could with a potentially dangerous situation.
“I think, as I always say, that the controllers, the individuals who run this system daily, perform quiet heroic acts, in spite of a system that is built to set them up for failure. I believe in those people doing their job, but there’s only so much stress they can take,” O’Brien said.
The incident has compounded existing staffing shortages and equipment failures and contributed to frustrating hourslong delays for passengers, Duffy told Fox News.
A CNN analysis of FAA airspace advisories shows at least 14 straight days of FAA-imposed delays for flights to or from Newark.
Airlines canceled 160 flights to or from Newark Liberty on Monday, with more than 400 flights delayed, according to the flight tracking website FlightAware. The airport’s cancelations accounted for more than a quarter of all flight cancelations nationwide on Monday.
And on Tuesday, the FAA announced a ground delay for inbound flights at the airport, causing further delays.
The FAA has indicated it expects delays at the airport to continue due to the staffing shortages. Duffy noted that authorities will have to slow traffic at Newark before restoring full capacity.
A stormy weather pattern stuck in place in the Northeast is further complicating efforts to keep air traffic moving through the airport in northern New Jersey, where low clouds and rain are expected throughout the week.
A traumatic event
The current shortage of air traffic controllers is nearly the worst in 30 years, said the National Air Traffic Controllers Association, which represents 10,800 certified air traffic controllers across the country.
The control facility responsible for traffic at Newark has been “chronically understaffed for years,” United Airlines CEO Scott Kirby said in a Friday message addressing the delays. He also said the shortage was compounded by over 20% of FAA controllers who “walked off the job” at Newark Airport last week.
The controllers’ union said workers did not “walk off the job.”
“The controllers didn’t just walk off the job, they were traumatized, their equipment failed,” the source with knowledge of the situation said. “It’s written in the regulations if they experience a traumatic event — they can take time off to go see a psychiatrist. The people working that day did that.”
But filling those empty positions is not an issue that can be sorted overnight, according to the FAA.
New air traffic control applicants must be younger than 31 years old so they can work the mandatory 20 or 25 years needed to qualify for pensions before their mandatory retirement age of 56, according to the FAA. Physical stamina and mental sharpness are critical to performing the job.
And air traffic controllers can’t simply fill in at a different airport without extensive preparation.
“When you first start at an air traffic control facility, you have to do a lot of memorization,” said Michael McCormick, a professor and air traffic management coordinator at Embry-Riddle Aeronautical University.
“Most air traffic controllers don’t just monitor one airport. Many keep tabs on dozens of other regional airports to make sure planes keep a safe distance from each other.”
The FAA acknowledged a wave of new controllers won’t come overnight.
“While we cannot quickly replace (the controllers) due to this highly specialized profession, we continue to train controllers who will eventually be assigned to this busy airspace,” the FAA said.
A total of 885 Newark flights have been canceled since the April 28 air traffic control meltdown, according to an analysis by FlightRadar24, which notes that not all of the canceled flights were related to air traffic control issues.
United Airlines has preemptively canceled 35 round trip flights to or from Newark – meaning 70 individual flights – per day.
Airline analytics firm Cirium says the Newark delays have been spiking significantly since April 26, days before the control equipment outage at the Philadelphia air control site.
“Since April 26, on-time departures have fallen to 63%, which is far below industry norms,” said Cirium’s Mike Arnot. “Prior to that date, an average of four flights per day were cancelled in April.”
A frail system in place
Flights arriving to Newark were experiencing an average delay of 4 hours as of Tuesday evening, according to the FAA.
One passenger, Geraldine Wallace, told CNN Sunday she was anxious about the staffing shortage after her flight was delayed for almost three hours.
Mark Wallace, her partner, told CNN he was more worried about equipment failures.
“As concerning as the manpower issue is, according to news reports, the equipment that they’re using out of Philadelphia is antiquated,” he said.
Flexibility waivers are now available to impacted United Airlines customers with flights booked on or before May 4 and originally scheduled to fly from May 6 to 17, United said in an announcement Tuesday.
A separate waiver is available to customers with tickets purchased on or before April 29 for trips scheduled between May 1 and 5, the airline said.
The Department of Transportation will announce a plan Thursday to transform the air traffic control system, remodeling an outdated system that contributed to days of delays at Newark, Duffy, the transportation secretary, told Fox News on Monday.
The system used to manage air traffic at Newark is “incredibly old,” Duffy said.
“We use floppy disks. We use copper wires,” he said Friday. “The system that we’re using is not effective to control the traffic that we have in the airspace today.”
Duffy has since pledged to implement a new, “state-of-the-art” system at air traffic control facilities across the country that would be the “envy of the world” – but said it might take three to four years.
“We are going to radically transform the way air traffic control looks,” Duffy told Fox News’ Laura Ingraham.
President Donald Trump has “bought into the plan,” he said.
Peter Goelz, former managing director of the National Transportation Safety Board, said he wasn’t sure he’d want to fly out of Newark for the next 10 days.
“We have a very safe system, but anytime it’s stressed like this, where you have controllers who are feeling under maximum pressure, it impacts safety – and people have a right to be concerned,” Goelz told CNN.
“You cannot expect humans to function at their highest level for sustained periods of time with this kind of pressure on them.”
This story has been updated with additional information.
--------
Tumblr media
Dontcha feel just...so very safe?
9 notes · View notes
lobautumny · 2 years ago
Text
So like, there's some really shitty video that this toy saw a while back about QoL mods in Terraria and how if you install all of them and then crank all of their settings up to the maximum, then the game basically plays itself. The whole video was weirdly hostile and vindictive and effectively just made fun of the concept of QoL features/mods as a whole. But it stuck in this toy's mind, not because the video itself holds any value, but because the core topic of how quality of life & accessibility features have a tangible impact on a game's design is really interesting and nobody talks about it with any kind of nuance.
So like, Terraria is obviously a very different game from what it used to be. But all of the raw content (hardmode, bosses, biomes, weapons, NPCs, etc.) that always gets the spotlight in updates only makes up a relatively-small portion of that outside of, like, the tinkerer’s workshop from 1.1, and damage classes being added in 1.0.6, both being relatively-early additions. The plethora of things that were changed/added to make the game look nicer also aren't the core thing responsible, obviously. So what is the biggest reason modern Terraria feels so alien when compared to 1.0.X versions, or even 1.1?
It's the quality of life features. Inventory management got exponentially easier/more efficient, you have a minimap at all times, smart cursor lets you expend far less effort mining and dealing with backwalls, there are special equipment slots for grappling hooks and light pets, grappling hooks are bound to a hotkey instead of being an item that you need to manually select and use, you can use items directly from your inventory instead of needing to place them in your hotbar and then select that hotbar slot, you automatically walk up 1-block inclines and open/close doors as you walk through them, there’s a plethora of features to make getting around the world trivial, the start of the game moves way faster due to the player getting access to better equipment faster, block-swapping exists… This toy posits that this is all why Terraria feels like a fundamentally different game. In old versions, it felt like you had to fight tooth and nail to get anything accomplished, but nowadays, everything feels all buttery-smooth. The main friction you encounter in progressing through the game is with boss fights, as Re-Logic obviously intends.
Now, obviously, it would be insane and stupid to claim that Terraria is a worse game, right now, than it was all the way back in the 1.0.X era, and it would be even stupider to claim that it’s worse because it has QoL features. However, this toy does not believe that every single QoL feature added to the game was inherently objectively positive or correct from the game's inception. Rather, they were natural, smart conclusions for Re-Logic to come to with the direction they decided to take the game in as it continued development. But this was not the only direction Terraria’s development could have taken.
There’s a very unique feeling to old-ass Terraria versions, and it sucks that tracking down and playing these versions is so goddamn hard. You only ever have a vague idea of where you are because there’s no map to use as reference so you’re heavily encouraged to keep most of your stuff on the surface, and to build infrastructure to connect important things underground/in the sky so you don’t get lost. Everything is so unwieldy that building a simple house and making it look remotely nice feels like a herculean effort, enemies kick your ass way harder earlygame due to decent gear being much harder to access, and there’s a lot more gravity to the choices you make in what gear you use, because it’s a lot harder to hot-swap your armor and accessories when you're not actually at your base, which is harder to get to/from due to the world being far more difficult to navigate, as a whole.
This all leads to an exponentially slower game than modern-day Terraria is, where every single thing you do needs to be deliberate and well-thought-out, and everything takes a much longer time to do. This toy remembers spending weeks as a kid building housing for the meager number of NPCs that were in the game back then, alongside farms for all of the potion-making herbs and a big obsidian generator, and all of that could be accomplished in a single play session in 1.4.X.
There is a universe in which Terraria saw minimal QoL updates and instead leaned really hard into this direction, making a slow, exploratory game where the player’s power level very slowly increments upwards and you’re encouraged to build large-scale infrastructure, rather than what the game currently is: a (relatively) fast-paced boss rush where your power balloons out of control immediately and your infrastructure is a fast-travel teleportation network that takes minimal effort to set up. That version of the game would not have been wrong, inherently. It would’ve been more niche, for sure, but it wouldn’t have necessarily been bad, or even worse than the current game is.
This is what makes this toy sad that old Terraria versions are so difficult to get ahold of, as well as what fascinates it so much about the retro Minecraft community. Speaking of, let’s switch gears and talk about Minecraft for a bit.
Minecraft, as it’s sure most of the people reading this are well-aware, has recently been having something of a renaissance in its retro community, the people who prefer alpha and/or beta versions of the game to the modern game. A handful of complete overhaul mods have come out for these versions (notably, Better Than Adventure and ReIndev) that put interesting spins on the game’s design, basically asking the question, “What if Mojang decided on a different direction for Minecraft to take from this point in time?”
A lot of these mods cast aside the instant-gratification convenience and linear progression of modern Minecraft in favor of slower-paced, more survival-ey gameplay, placing more emphasis on the act of exploring your world and gathering resources as the core gameplay loop as opposed to… Well, modern Minecraft really doesn’t have much of a core gameplay loop to speak of, and that’s sort of the problem, now isn’t it? This toy doesn’t want to get too far into all of this, though, as its thoughts on Minecraft’s game design are not the focus of this essay. Rather, it wants to put the spotlight onto Minecraft’s community.
An ever-increasing number of people have been growing more and more critical of Minecraft over the last 5 or so years. It’s obviously always had its detractors, but in recent time, there have been more of them that have gotten more vocal, and it’s become pretty normal to have the take that Minecraft has been getting worse lately. And a big culprit that people keep pointing to is QoL. One of the most common criticisms of Minecraft online is that quality of life features have made it way too easy to trivialize the process of blasting through the game’s content, getting obnoxiously overpowered enchanted diamond (or netherite) gear, reaching the End, and getting access to elytra and shulker boxes.
Despite both being excessively popular games that have been made far easier through their QoL changes and overall polish, that have both been in constant development for over a decade at this point, the critical responses to those features in Terraria and Minecraft could not be more different. This is amusing, and gets at something deeper with regards to game design that this toy doesn’t know it’s ever heard anyone actually say: Quality of life features are fantastic tools for reducing the noise that gets in the way of a game’s vision, but when you add them haphazardly and/or with no real vision for what you want your game to be in the end, you can very easily wind up accidentally removing a large portion of what could’ve otherwise become compelling parts of your gameplay loop. They need to be used intelligently, or they can, in fact, harm your game and make a significant contingent of your playerbase enjoy it less.
79 notes · View notes
shroomie-23 · 11 days ago
Text
Titan army week, Day 2 Poison/Betrayal
[This one comes in three parts; part three can be disregarded if you're a fan of bad endings. Enjoy!]
Part one
The sun was already high in the sky by the time Alabaster finally got out of bed. It had been an hour since he awoke and even longer since Ethan had gotten him. 
The morning passed quickly with it being Alabaster’s turn to watch his youngest sibling, Carnelian. Breakfast was handled by Ethan—who was always more cheerful before noon and far too willing to make an extra helping of eggs for the energetic little demigod bouncing in their seat.
The war had ended almost a year ago, but a complete shift across the world of Greek divines was already underway. Gods were dragged out of their thrones and new ones were erected; entire systems that hadn’t seen variation in centuries were going through the process of change in a matter of days.
Finally, a treaty between monsters and demigods was made, there was no reason for them to go after demigods when they had around-the-clock access to their parents. 
Negotiations were made and New Rome was allowed to continue as its own nation once they ‘accepted’ the new system. Alabaster was still salty about their resistance but a contract was made so there wasn’t really much more to say. 
Camp Half-Blood dispersed before any actual agreement was made, but the witch doubted those kids were in any rush to fight another war, especially for a lost cause like this.  
The titan army itself became less of an army and more of a company; everyone had a job and everyone was rewarded for it. Most half-bloods were employed by their parents or by the titan equivalent, others worked simple guard jobs or did odd jobs and gigs here and there to stay afloat, some worked under Kronos himself running everything, and the rest were content to rejoin the world of mortals and receive monthly payments for being a soldier in the first place.
Regulations were put in place to make sure any and all new half-bloods were taken care of and registered, as well as brought into the divine workforce when they came of age.
Alabaster and Ethan worked directly under Kronos, managing the titan’s time as well as relations to all other divine institutions. Most of that work however was done remotely and electronically, leaving the two to live in a simple apartment in Manhattan, spending most of their time volunteering or keeping connections or simply following up on things that interested them. 
It was a good life and Alabaster was happy to have it. 
Despite all the history being made, the mortal world remained much the same, sans the occasional major god in disguise going around to make new demigods. So when Carnelian asked to go to the playground, the couple didn’t hesitate to take him.
It was a sunny Friday morning, meaning the playground was completely empty.
Alabaster sat cross-legged at the edge of the long bench, the paperback spellbook he was pretending to study perfectly still in his lap. 
A dying patch of dandelions sprouted near the swings, where Carnelian had decided to stage a one-kid siege against his much larger brother-in-law.
“Do I let them win this time?” Ethan asked, half-heartedly defending himself from Carnelian, who was currently grinning like a maniac and using a chipped foam sword he had begged for on his birthday to strike at his enemy with a theatrical menace he shared with his big brother.
Alabaster didn’t look up from his book, the small smile on his face the only indicator he was paying attention. “That depends. Can you handle the tantrum if you don’t?”
Ethan groaned. “Fine. Long live the tiny tyrant.”
Carnelian shrieked with delight as Ethan dramatically collapsed onto his knees, clutching his chest like he'd been stabbed through the heart, the foam sword tucked harmlessly under his armpit, mulch softening the impact. 
Alabaster just chuckled at the show, finally closing his book to help his lover up from the ground. 
“Well, since you were able to defeat Ethan so easily I don’t think you’ll have too much trouble taking us both on.” 
The ‘fight’ went on for less than half an hour before Ethan and Alabaster sat down on the slide, letting Carnelian run off to do whatever he pleased within the confines of the rusting playground. 
“I give it ten more minutes before they try to climb the fence,” Ethan murmured.
“I give it less for them to give up and try summoning something not meant for the mortal plane to aid them,” Alabaster replied, having reopened his book with one hand while keeping his other curled around his lover’s.
The playground was silent save for Carnelian huffing and kicking the green wire encircling the park, it was almost peaceful in a way, like the whole world was taking a breath. 
Then the air shifted.
The wind stilled. The smell of warm blood drifted down from above.
A shadow passed overhead—too large to be any normal bird–too small to be any sort of aircraft. 
The vulture descended in a slow spiral, wings outstretched like a cross. It landed atop the monkey bars with a heavy rustle, talons clicking against the metal, a simple parchment scroll tied to one of its legs. 
Now it truly was silent, even Carnelian was watching with wide eyes. 
Ethan was the first to move, walking slowly until he reached the godly messenger, bowing his head before gently taking the scroll and sending the bird back into the sky. It was warm. Still pulsing with Kronos’s seal, wax dripping like blood. 
Alabaster appeared beside him without a sound, spindly fingers ghosting over the front. The seal was unmistakable—molten gold stamped with the hourglass symbol of Kronos. 
He unrolled the parchment with bated breath.
It was an invitation to a banquet, set for Sunday, celebrating the one-year anniversary of the titans’ victory; it invited [demanded] ALL half-bloods to attend.
Well, at least Alabaster knew what they’d be doing this Sunday.
Part two
Time passed differently up on what used to be Olympus, it flowed as Kronos willed it, adjusting and adapting to his every want and need.
He was a busy man, with an empire to run, nothing got done without his approval, every move made was made under his watchful eyes.
Kronos had won, and now he was reaping the benefits; he and his siblings ruled above the Greek pantheon. The other immortals were predictable, powerful but predictable. He was able to control them, and that was what mattered.
There were very few things Kronos couldn’t control, and after Sunday that list would shrink to near zero. 
See, while half-bloods were a useful tool–descendants of deities with the power to roam and meddle wherever they pleased–they were also extremely tedious to work with and an impossible threat if they ever unionized. They made for an amazing army, but now that there was no need for an army, there was little reason to keep them around.
Not as they were, anyway.
Demigods at their core were simply just disasters waiting to happen, beings of immortal power raised as mortals, fed the same education and ideals as the average insignificant person walking the world of mortal history and rebellion. 
Their logic was human, their morals were human, and that was simply just not something Kronos could keep neatly contained like he wanted. 
They had been tolerated because they were useful. Now they were a liability. And Kronos did not believe in liabilities.
So he’d just have to start over, all the younger impressionable half-bloods were already registered and ready. Ripe fruit for him to pluck and mold to his desire. 
All he needed was to safely dispose of older ones. 
A slow solution was required. One that would not cause unrest or the possibility of rebellion. One that would render gods and mortals alike witnesses, left in the dark until the full severity hit and there was nothing more to be done.
The poison was divine in that way. An honestly genius idea courtesy of only himself. 
Rabies—not the mortal strain, but its ancient, purer form, a madness born from Tartarus and tempered by Algae's touch. It would unravel the mind with grace, dulling memory, stifling thought and stirring violence in subtle waves. 
Slowly chipping away at the affected's consciousness until nothing but a feral husk remained to be cut down by their own devastated kinsmen. 
The liquid death would be spread through the wine, ensuring any demigod who was old enough to drink it would be tainted. 
The thought made Kronos smile. 
The preparations were complete. The wine had already been decanted. The chalices set. 
Soon everything would fall into place, and the entire pantheon would sit nicely in his palm like it was always meant to.
Part three/happy ending 
Prometheus had always been good at watching from the margins.
That was his gift, and his curse—he saw storms before they formed, consequences before they arrived, and threats before they had names.
So when Kronos began planning the banquet, Prometheus watched. 
When the wine was decanted in secret, when the poison was poured in silence, when the servants were dismissed and the toasts were written, he watched.
And when Kronos turned his back—just once, just long enough—Prometheus moved.
He didn't need to touch the wine to know what brewed inside it. The divine essence hummed wrong in his bones. Rabies, the old kind. The mind-rotting, soul-shattering, unraveling kind.
Prometheus, chained once for his love of mortals, titan of forethought and crafty counsel, refused to let the soldiers that freed him from his punishment fall victim to his brother’s possessive greed. 
He worked with the same precision that once taught humans to create fire and make it their own.
He replaced the rabies-infused wine with another wine tainted by a poison of his own making.
A fast-acting, non-lethal purging draught. A concoction that would trigger fever, vomiting, and a temporary shutdown of minor magical channels. Unpleasant, but harmless.
The sickness itself wouldn’t do much physically, but mentally it would inform the half-bloods that someone had tampered with their wine.
Any demigod who drank would fall sick almost instantly. Not to die, but to grow suspicious and eventually rebellious. 
Prometheus wiped the rim of the chalice clean and left the banquet hall just as silently as he'd arrived, the fire in his chest burning brighter than it had since peace was instated over the pantheon.
Let Kronos toast.
Let the tyrant raise his glass and call it peace.
Because when the first demigod clutched their stomach, and the second collapsed to their knees, and the room erupted in confusion and fury—
They would know.
They would know he had tried to kill them. 
And knowing his clever creations, they would rebel with vengeance sweeter than the wine that had been made to end them. 
@titan-army-week
4 notes · View notes
inspofromancientworld · 1 month ago
Text
Brazilian Cave Paintings
Tumblr media
Source: Google Maps
Near Rio de Janeiro, Brazil, in Itatiaia National Park, researchers have found cave paintings in the Serra da Mantiqueira formation. The researchers are also looking for more evidence of human habitation around the site and considering how people lived in this area and what resources they had access to, as it is now relatively remote.
Tumblr media
Source: https://agenciabrasil.ebc.com.br/en/educacao/noticia/2025-04/cave-paintings-discovered-rio-de-janeiro-park
The paintings were created sometime between 3,000 and 2,000 years ago, though research is only beginning. They were discovered by chance in 2023 when a park worker was on a climbing trip, drawn by a grouping of red lilies. He initially thought the paintings were 'graffiti left by tourists'. When he realized there were no names and dates, things usually left behind by tourists, he 'realized it could be something very old.' After taking pictures, he notified the Chico Mendes Institute for Biodiversity Conservation, which manages protected areas in Brazil, leading researchers back to the site, 'a moment of great joy. For me, it felt like discovering it all over again,' he said.
Tumblr media
Source: https://agenciabrasil.ebc.com.br/en/educacao/noticia/2025-04/cave-paintings-discovered-rio-de-janeiro-park
One of the researchers noted that they were 'surprised to come across an entirely new site…it's not hidden away at the top of a peak where only a few mountaineers go. It's in an accessible area…I've hiked there myself.' Another researcher pointed out that much of the research around Rio de Janeiro had been focused on the coastal areas, leaving the 'interior and its diverse cultural expressions…overlooked'.
Tumblr media
Source: https://agenciabrasil.ebc.com.br/en/educacao/noticia/2025-04/cave-paintings-discovered-rio-de-janeiro-park
This site is relatively small, and with it being on a hiking trail, researchers are concerned about vandalism or the site being dug up by the curious, compromising the ability to understand the chronology of the site through careful research. They do not yet know if this site is connected to groups from modern-day São Paulo, Minas Gerais, or others from the Paraíba Valley. To protect the site, the park has cordoned off the area, and anyone who disregards the barriers faces heavy fines. At present, researchers are focused on preservation of the site and do not have plans to open the site to visitors.
4 notes · View notes