#why do I need to use this horrible horrible operating system. let me use linux this is a cs course why can't I use linux.
Explore tagged Tumblr posts
Text
I hate windows 11 so much. So so so much. Worst operating system I've ever had to use. Leave me ALONE with all the popups and horrible programs I didn't install. I DON'T CARE ABOUT MCAFEE, HOW ABOUT YOU WORK INSTEAD. And don't treat me like a mentally challenged neanderthal, I have used a computer before.
It might've now broken my ubuntu installation, istg I will commit terrorism if it's not fixable. Who does it think it is to even fucking touch that partition.
#i'm about to throw my laptop across the room I've never ever been so mad#apparently 'critical error: your start menu isn't working. please log out' which is the ONLY action they allow after that.#except they do not fix it when I log in again. they also don't allow me to do anything else about it. i want to kill windows.#why do I need to use this horrible horrible operating system. let me use linux this is a cs course why can't I use linux.#personal
0 notes
Text
My horrible Odyssey to install Git LFS on my Synology NAS
So this is a bit of a different kind of post from what I usually write. But I NEEDED an outlet for the incredible buildup of frustration that I have been dealing with for the past WEEK. The objective seemed easy when I decided to start this terrible journey:
Install Git LFS on a Synology NAS disk station.
The Status Quo
My friend and I wanted to work on a Unity project together, so "normal" Git, I was told, would not suffice. It had to be LFS. We looked into other options before, but I thought "Hey, why not use the readily available NAS I have? Why use external, even paid services if we can just use a homebrew solution?" And that's where the descent into madness started.
I have a DS418j, not the most advanced or expensive model. It is important to note that I went into this as a completely clueless person.
I never used Git via console commands. My knowledge of Git was generally very limited. Even at the end of all this, I still BARELY know how to set up a repository and interact with it.
I had no idea what LFS even really does.
I only had very rudimentary knowledge of how to operate my NAS via its user interface.
I never interacted with a Linux console before. It turned out that I would be doing that a lot.
A Walk in the Park?
At first, everything seemed pretty straightforward, to be honest. I googled the simple task of setting up Git on a Synology NAS. The first result was already pretty helpful.
It seemed like all I had to do was set up a new user on my NAS ("gituser") and install the available "Git Server" package from the NAS's user interface, along with "WebDAV", another package that could be installed via the interface.
WebDAV, as I found out, was a key component in my journey. It was a bit of a struggle to set up, but it appeared to be important for connecting to my NAS via HTTPS - and probably for other things that I still have no idea about. I didn't even know why I was installing WebDAV in the first place, because I intended to use Git via SSH, which another setting on my NAS would provide: the Terminal setting in the system settings. That's where I enabled SSH via port 22.
Well, my friend then told me that we cannot use LFS via SSH. Okay, I thought, that's what WebDAV is for, after all.
The Git Server package had very few options, which seemed fishy to me. It literally only had one window where you set permissions for users. I gave gituser the permission to access it. That was that.
Of course I also needed a shared folder for our repositories ("git"). Creating that was not hard either. Here I noticed that gituser needed to be part of the "administrators" group for Git Server to work properly. But I could not remove it from the "users" group, so things got a bit fucky with permissions. I ended up giving both the administrators and users groups a lot more permissions than I was comfortable with. But I trust my friend not to wreak havoc on my NAS for the time being.
So, everything was set up. Or so I thought.
Hitting the first Bump in the Road
I was able to connect to my NAS via SSH only, but didn't think anything of it yet. Doing that, I used Sourcetree to create the first test repo. Went to the NAS interface, checked, it was there and looked good. I could push and pull files. Created a second repo, ran "git lfs install"... and it didn't work.
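Sourcetree hides the actual commands, but under the hood what I was doing amounted to roughly this - the host name and path here are placeholders, not my real setup:

    # Clone the test repo from the NAS over SSH (port 22, the gituser account)
    git clone ssh://gituser@diskstation.local/volume1/git/test-repo.git
    cd test-repo

    # Try to set up LFS for the working copy
    git lfs install
    # git: 'lfs' is not a git command. See 'git --help'.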
The lfs command was not known.
I quickly found out that... of course it could not be known. Unlike GitHub, for example, my NAS did not come with LFS already set up. So, I concluded, I had to go install LFS on my NAS.
...Easier said than done.
While it does support a console with most regular Linux commands... a package manager is not included. At least none that is easily accessible, or supports any common packages.
At this point I figured "Why deal with this myself?" and contacted Synology support. I asked them "how can I set up Git LFS on my NAS?"
And Synology Support said:
¯\_(ツ)_/¯
They told me they do not offer support for the console. They said I should go ask the community. Okay.
I did not ask the community, but I did a lot of googling and found out: I could not "just install" LFS. I had to get creative.
We heard you like package managers?
First, I figured out that I needed to be able to use the "curl" command. All the binary files on LFS's packagecloud were apparently downloadable with curl. I did not know what curl was... but I knew I needed to get it working.
I found out that for curl to work, I needed to install PHP on my NAS. Luckily, that was possible via Synology's included package manager. But for PHP to DO anything, I also had to install the "Web Station" and configure my PHP version there. I figured... might as well!
After enabling a couple of PHP commands, I felt ready to download LFS. But the question was... which version? What even was my OS?
As it turns out, Synology uses a custom Linux version for their DiskStations. Of course, LFS does not "officially" provide a package for that version. I tried the Node.js version, because I noticed I also had Node.js installed on my NAS, but unfortunately I ran into the version issue there as well when I tried to install the package through Node.js. Not even changing my Node.js version helped. Many hours later, I randomly tried the .deb and .rpm files instead of the Node.js ones. Those also didn't want to work, despite me eventually figuring out how to lie to them about which OS I was using.
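If you ever need to check for yourself what a DiskStation is actually running, a couple of generic Linux commands over SSH will tell you (nothing Synology-specific here):

    uname -a          # kernel version, hostname and CPU architecture in one line
    uname -m          # just the architecture, e.g. armv7l, aarch64 or x86_64
    cat /proc/version # the kernel build string

That at least tells you which architecture any downloaded binary would have to match.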
I was almost ready to give up at that point. I was at least 3 full days into my odyssey already.
But then I spotted something else... A thing called "GoLang". Apparently, it would be possible to download LFS via GoLang. However, to do that, I of course needed to get Go first.
An initial search got me on track to "ipkg", which promised to enable me to install Go. But after reading up on it a bit, it looked woefully outdated. I had already downloaded it and was about to install, but ran into errors and trouble again.
That was when I found "Entware". It's similar to ipkg, but uses "opkg" as its package manager. I was able to install Entware on my NAS without much trouble, and it contained the Go package that I needed so direly.
While I was at it, I also installed the available "git" and "git-http" packages from opkg, just to make sure. Who knew if they'd come in handy. I don't know if they did, but they also didn't seem to cause any harm.
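For anyone retracing this, the Entware part boiled down to something like the following once its installer script had run - the package names are the ones I remember seeing, so double-check them against what opkg actually lists:

    # Refresh Entware's package lists, then pull in Go plus the git packages
    opkg update
    opkg install go git git-http

    # Sanity check that the new binaries are found
    which go
    go version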
Now, with Go installed (which went surprisingly smoothly), I was able to access just about anything on the internet and install it on my NAS! Hallelujah!
But if you thought it was over... just look at the scrollbar.
The end of my odyssey was finally in sight. I thought that nothing could go wrong anymore, now! With the help of Go, I was able to install the LFS binary. I was able to run it in my console, too. I was finally able to run "git lfs install".
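I won't pretend I remember the exact invocation, but installing the LFS binary through Go looks roughly like this - note that the module path may need a version suffix (such as /v3) depending on the git-lfs release you end up with:

    # Build and install the git-lfs binary into Go's bin directory
    go install github.com/git-lfs/git-lfs@latest

    # Make sure git can find it, then hook LFS into git
    export PATH="$HOME/go/bin:$PATH"
    git lfs version
    git lfs install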
...and it didn't help.
I got a bunch of errors, again. Instead of getting closer to the solution, it seemed like I had just managed to find more obscure errors. This is also where the SSH detail became important: from what I found out, LFS does not like SSH. But SSH was the only way for me to connect to my NAS from my PC!
In a fury of looking up stuff again, I found the "DDNS" option on my NAS. That would allow me to get a hostname and use it like a normal website address! I kinda rushed through the setup because I had no idea what I was doing.
Well, whatever I did, it seemed to be sufficient. My friend could now connect to my NAS over her Sourcetree client. But when she tried to upload LFS objects, it just didn't work. She got a "404" error.
It wasn't hard to figure out that 404 meant no access in this case - my NAS was simply refusing to show any directory to someone who doesn't have access to it. Cue a long journey into my NAS's interface to make sure the gituser had the right permissions. I also changed the password because I read something about WebDAV being particular about some passwords. I also made a new user to see if maybe I just messed up setting up gituser.
To test if I was making any progress, my friend and I tried to access my NAS via our web browsers. But no matter what either of us tried, we couldn't get in. 403 Forbidden or 404 Not Found - those were the only results. I couldn't even get access with my admin account.
I tried to hack my way into pushing anyway, and only ended up corrupting our repo's history with "missing" and "broken" files because they were never properly uploaded, but LFS thought they were.
It should be noted that I had just accepted that HTTPS wouldn't let me connect from my PC. So I had set up a mobile hotspot via my phone and used my laptop to do these things. I was in denial about eventually having to fix this, because I'm on a tight mobile data plan, and uploading and downloading Unity projects several GB in size wasn't going to happen that way.
Synology Support to the Rescue! ...Or?
It seemed like we had finally narrowed down the issue with our LFS upload attempts when I checked the WebDAV console and saw that it had denied our login attempts through the browser and Sourcetree as an "authorization failure". So something was wrong with WebDAV.
I contacted Synology support a second time. I asked them, "Why can't my friend and I connect to my NAS via the internet when I have WebDAV enabled and everything port forwarded?"
And Synology Support said:
¯\_(ツ)_/¯
They told me WebDAV and web browsers don't use the same HTTP and HTTPS methods. They are simply not compatible. They told me I should download their WebDAV client or connect locally.
So it was known from the start that what I was attempting could never work... but it was also not mentioned anywhere on the web or Synology's help pages that this was the case.
We have a saying in Austria: "jemanden deppert sterben lassen". It translates to "to let someone die stupid". Essentially, it means that you have information and you watch someone else without this information struggle without ever telling them about it voluntarily. I felt this saying was very appropriate for my situation.
Time to give up, I guess... Except I didn't.
I was almost a week into my odyssey by now. Maybe it's the sunk-cost fallacy, but I couldn't abandon all my work now. I refused.
A Light at the End of the Tunnel
I went back to open another browser tab (my average was 20 open tabs during this... normally it's 2 to 3) and searched for a solution that would work with WebDAV. And truly... there was ONE repo online that offered one.
A special thanks goes out to this fellow: https://github.com/mpotthoff/git-lfs-webdav
They straight up saved our project from collapsing into a pile of tears and rage. I installed this package on my NAS, which... sort of worked. It turned out I needed to install it locally (as well?). So I did. But I needed to install Git, LFS, and Go on my local PC as well for that.
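I won't reproduce the repo's README from memory, so treat this as a rough sketch: the module path and the agent name below are assumptions based on the project name, but the two git settings are git-lfs's standard way of registering a standalone custom transfer agent (the exact steps and extra options are in the README):

    # Build the transfer agent with Go (binary ends up in Go's bin directory)
    go install github.com/mpotthoff/git-lfs-webdav@latest

    # Tell git-lfs to route transfers through it instead of the normal LFS API
    git config lfs.standalonetransferagent webdav
    git config lfs.customtransfer.webdav.path git-lfs-webdav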
So with the help of Go, I built an exe file for my laptop, which then gave me a 401 when trying to push to LFS. Luckily I expected that. And I was overjoyed, because FINALLY a different error.
I tried to run the steps in the git-lfs-webdav repo to fix it... but got a strange error in the console.
It complained, when I tried to enter my username, that the "handle" for the password was wrong. But I hadn't even entered the password yet! Searching some more on the internet gave me no conclusive answer. On a whim, I tried a different console - my Sourcetree console apparently runs "MINGW32" while my Git console runs "MINGW64". Switching to the Git console fixed this problem for me, and switching to the Windows shell fixed it for my friend.
And then, it finally worked for my friend.
She could upload a test image via LFS, and I could receive it via LFS on my laptop.
The rest was me calling my internet provider about my PC being unable to connect. The internet provider said ¯\_(ツ)_/¯.
Luckily I did not have to mess with my DNS or subnet mask or anything of the sort, or buy a VPN/proxy. All I had to do was create a self-signed SSL certificate on my NAS, download it, and import it into my PC's trusted certificate authorities. My friend had to download and import it too.
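For anyone in the same boat, the certificate needed to go into two places on Windows: the system's own trusted root store, and Git's idea of which CAs to trust for HTTPS. A rough sketch, with placeholder file names:

    # Import the NAS certificate into the Windows "Trusted Root" store (admin prompt)
    certutil -addstore Root nas-selfsigned.cer

    # Point Git at the certificate too, so HTTPS pushes stop complaining.
    # (This replaces Git's default CA bundle; a URL-scoped http.<url>.sslCAInfo
    # setting is the more surgical option if you use other HTTPS remotes.)
    git config --global http.sslCAInfo C:/certs/nas-selfsigned.pem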
In Summary...
This was a horrible, terrible, awful journey. I would not recommend attempting this at home to anyone.
Even now, what we've got going on only sort-of works with our Unity project, because Unity is... Unity. We're still figuring out the details, for example why scene content is going missing and so on.
But I believe that the worst and most difficult part is over. Will I be able to recreate this in a year or two, maybe for a different repo, on a different PC?
Probably not.
3 notes
Text
Stuffit expander mac 10.6

Recently on Mac Yak episode #11 (which unfortunately got horribly mangled by YouTube, so it's not easy to watch) the topic of Classic data preservation and transfer was brought up. The topic was "How to transfer files between old and new Macs", and it got me thinking: how far back can we go to send files between different Macs and OS versions? What really is the best way to package this data to ensure it survives transfers to Windows PCs, Linux servers over FTP, etc.? So I took what Steve talked about on Mac Yak and ran with it. This article will go over the available options for creating such archives for each major OS version. Below is a list of Mac OS versions and architectures and the best Stuffit for them. You can grab the below mentioned resources from the Wired server.
But first, for those not in the know, why even bother? What makes Classic data so fragile? When we deal with old Mac software or files, e.g. Classic (pre-OS X), there are resource forks to consider. A resource fork stores information in a specific form, containing details such as icon bitmaps, the shapes of windows, definitions of menus and their contents, and application code (machine code). For example, a word processing file might store its text in the data fork, while storing any embedded images in the same file's resource fork. The resource fork is used mostly by executables, but every file is able to have a resource fork.
Uploading a Classic file or application directly to an FTP site or to a non-Macintosh computer will likely render the data useless. A good way to safely transfer Classic files is to first compress them using a resource fork friendly compression utility like DropStuff to create a … Once the Classic file is compressed, its resource fork is preserved, letting you upload it to an FTP site or to non-Mac computers safely. Formats like .tar do not support Mac OS resource forks, so don't use those to archive Classic data. .bin will preserve the resource forks and preserve file integrity when stored in a non-Apple file system. .bin file formats are also not preferred though when done properly. A .toast image should also preserve resource forks, but you end up with much bigger files, as there is no compression, or at least not as efficient compression.
Snow Leopard 10.6.2 MacOSX Intel/AMD-Hazard | 4.43 GB. This Snow Leopard is made from retail Mac OS X 10.6 with updates 10.6.1 and 10.6.2. The DVD includes an Intel Atom fixed kernel for the 10.6.2 update, support for Intel Pentium 4 with the Legacy Kernel, a SATA fix and more. It supports most modern hardware for Intel and AMD (AMD users need to patch cpuids with Marvin's AMD Utility). Also listed: Mac OS X Snow Leopard 10.6.3 Install DVD (untested, DVD-ROM, 2010, multilingual), and Mac OS X 10.5 Leopard install DVD – full ISO image with serial key.
Mac OS X is a system that has made its name throughout the world. Remarkable entries on Apple's list of 300-plus features might seem trivial, but if even a handful of them hit you where you live, that will be more than sufficient impulse for you to upgrade. Get a free download of operating system software in the specialized download selection. The most relevant download for a Snow Leopard 10.6 DMG torrent is Mac OS X 10.6 Snow Leopard, the latest upgrade to the Mac OS X Snow Leopard installation DVD, which is available in .DMG format and can be made bootable. Download Snow Leopard Mac OS X 10.6.8 for free. Mac has introduced many operating systems, but Mac OS X Snow Leopard version 10.6 is the best among them. Press the button at the bottom if you want to download Mac OS X Snow Leopard 10.6 for free.

0 notes
Text
Linux Life Episode 35
Hello folks and welcome back to Linux Life. Well, since last episode I have changed the ArcoLinux install on my i7 desktop from Mate to Budgie.
This was because the way Mate was handling certain minimised programs was a bit problematic - sometimes programs would not reopen once minimised. Also, I have plans to dual boot the machine, and because I had not installed that version via UEFI it was causing issues with the multi-boot setup.
Now, I admit the last time I used Budgie was with Ubuntu Budgie, where the Software store app crashed every time I installed a program, so it did not last very long on the test machine I tried it on. However, because ArcoLinux is based on Arch, its primary way of installing software is through Pacman or Pamac. It still has the Software app, but I will come back to that point in a bit.
Now, Budgie is a complete lightweight rebuild based on Gnome 3. What exactly has been changed I have no idea, but let's just say Gnome 3 and I have not had the best relationship.
Antergos used it as its installer base, and due to the issues with Cnchi it caused many a headache to get installed. I also don't like the huge icon launcher thing that looks like it fell out of the Fisher-Price factory: it has no category sorting, so you have to search through tons of icons to find what you want. OK, there is a search bar, but that's only fine if you know what the app you want is called.
Now, Budgie still has this horrible icon launcher, but because I use Cairo Dock I have moved my most used apps to it, so I very rarely use the launcher. My Cairo dock also has an Application Menu applet, which gives me a categorised listing of programs should I need it.
Anyway, installation was painless, but when it loaded for the first time after install I could not get my wifi to connect. It would not accept the key. I had entered it about five times and was getting frustrated. I pulled up the wifi settings and, sure enough, it could see my router.
However - and I have no idea why - my keyboard layout was set to Belgian instead of UK English, so it was set as an AZERTY keyboard with odd accents. Once I set the keyboard to the right language I could enter my wifi key, and I could now connect to the internet.
ArcoLinux still won’t automatically sign you in even though I did set it in the installer options. Don’t know why that is but it’s not the end of the world.
Now, because Budgie is pretty much Gnome 3 in sheep's clothing, the application menu was at the top like on Mac OS. For the Mac that's fine, as you can use the icons, but Gnome 3 does not have any icons on screen unless you add an extension to put them there.
This brings me onto extensions. Why Gnome 3 does not keep all its settings in one place like everyone else I will never understand; the base settings are just rudimentary, and if you want to change the appearance of things like themes you have to go to a separate settings menu using the Tweaks app.
I wanted the Applications panel bar at the bottom of the screen, so I tried dragging it, but no joy - it would not move. So I had to find an extension to do it, and this is where the Software app comes in. I had to install an extension called the MMOD loader and then turn it on using Tweaks before I could move the bar. What a carry-on.
I’m sure there was another way such as using the SUPER key which is apparently the Windows key and Shift probably but I never tested that method as I only learned that much later.
So now the Applications bar was on the bottom and the MMOD added the Favourites next to it which was handy, however when it moved down. Where in the top it said Applications so you could click it was now just a blank space.
After a bit of fiddling around looking in Tweaks and finding no joy I went to the internet and guess what it needs another extension of Application settings which gives yet a further setup panel.
However I managed to get an icon back and it allows you to customise the text it displays so not complaining. You can also change the icon used so now I have a metal Arch logo. Which looks quite good so bonus points there.
Now I wanted to change the look of certain other bits, such as the icons, so I had to go back to Tweaks. There I managed to change from the dark look to Adwaita, which is a more standard white.
I know Dark Mode is the in thing due to Mac OS Mojave, and at times I don't mind it. However, for all the theming I changed, the main bar was still black. Adwaita was not available for the Shell, which is what the bar is known as.
So off I went to Gnome-Look.org, which has the largest selection of Gnome themes, including Gnome Shell ones. Every one in the list was either black or some weird colour. I did find the Adwaita-White theme, but its download button was disabled for some reason.
After a bit of searching about, I settled on a theme called Square Glass, which made the shell look like transparent glass. OK, so after downloading it I had to figure out how to install the theme.
Now, as I had User Themes turned on via Tweaks, there should apparently be a .themes directory in my home directory. There wasn't. Even when I created it and moved the theme in there, no joy.
OK, off to the Internet I trot. Eventually I had to log in as root and move the extracted theme to /usr/share/themes, and finally it worked. The only thing that is a little annoying is the shutdown dialog: the panel is transparent, as it should be, but the Shutdown, Restart and Cancel buttons are black.
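For the record, the two install locations boil down to this - the archive and folder names below are just guesses at whatever the theme download extracts to:

    # Per-user install (the route that apparently should have worked, but didn't for me):
    mkdir -p ~/.themes
    tar -xf Square_Glass.tar.xz -C ~/.themes

    # System-wide install (what I ended up doing as root):
    mv ~/.themes/Square_Glass /usr/share/themes/

Once the theme sat in /usr/share/themes it finally showed up where Tweaks could apply it.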
I’m sure there is a way to change this as technically the theme is just a CSS file I believe so there is probably a way to change it. If you know how to alter them. I am sure there is a guide somewhere on how to do it. I will have to see if I can find one.
Since I was changing themes I decided to change the Grub theme as ArcoLinux does not use a graphical boot it’s a text one which looks rather lame. Gnome-Look.org has a selection of Grub themes and I now have a nice green Shodan for System Shock 2 ASCII style theme. Luckily Gnome 3 has a nice tool called Grub Customiser which makes it simple to change theme one you install them to /etc/boot/themes.
So that has been my journey through this change. The reason I discovered the SUPER-key move trick was that when I installed DaVinci Resolve 15, it did what it had done under Mate: the application window ended up off the edge of the screen.
When you move an app using the SUPER key (their capitalisation, not mine) method rather than dragging it like any other window, it seems to jump in stages, which is odd.
Now that I have it set up with all the extensions and have managed to sort out the themes and such, it's actually working fine. Sure, it still has that silly icon launchpad, but since I use Cairo Dock I very rarely have to touch the silly thing.
So why it is known as Budgie I don't know, as it's really just Gnome 3. However, for all I have criticised Gnome 3, it does the job once you get it how you like it, and it seems pretty stable.
Anyway that’s enough waffle for this episode. Man this was a long one. So all I need to do is install the Hackintosh as the other operating system so that it is set up to dual boot.
If I get it working I may set up a new blog called Hackintosh Life to run parallel to this one so I will report the trials and tribulations of that there. So until next time ...Take care.
0 notes