I am a self-taught developer with knowledge of Azure, C, C++, C#, TypeScript/JavaScript, HTML and CSS, along with several databases. Having worked in IT for over 30 years and co-founded and sold a successful company, I am now working on products for the TV industry. Follow me on Twitter @coderanger, Mastodon @coderanger@dotnet.social and GitHub.
Powershell Tips
As I am forever forgetting things, I thought this would be the best place to put these tips, for myself and anyone else.
Renaming files in a folder
Get-ChildItem -Filter "*1x1.png" | Rename-Item -NewName {$_.name -replace '1x1.png','1x1-light.png' }
(Add -recurse to the Get-ChildItem call to recurse through sub-folders).
Deleting files in a folder
Get-ChildItem -Filter "*1x1.png" | Remove-Item
ASP.NET Core Onboarding Woes
I thought I would finally do a bit of dabbling in ASP.NET Core, and boy, the onboarding experience is something, isn't it?
Firstly, I decided on a server-side rendered experience instead of a SPA; and as I love TypeScript, that is what I will be doing any client-side scripting in.
I had already decided on supporting only modern browsers, and I don't want to complicate things with a slow and painful bundling experience using Rollup or webpack, which makes debugging awful.
ASP.NET Core
So the first issue I noticed was the ugly URLs; capitalised URL parts ... really, urgh! I understand this is based on the file names, but I don't want to rename the files to all lowercase, as that's not the .NET way.
After a fair amount of digging this is resolved with a routing option in your Startup.cs ConfigureServices method:
services.AddRouting( options => { options.LowercaseUrls = true; } );
Why this is not the default, I have no idea!
Typescript
OK, so now onto adding in TypeScript; again, why is this not already set up in the default templates? Maybe then they would have resolved all the pain points which took me many hours of messing about to resolve.
Also, as I only want to target modern browsers (Edge, Firefox, Chrome, Safari), I want to be able to use the latest features, like modern module support and so on.
OK, so this took me a while of mucking about and working around, but it seems that once you add the TypeScript MSBuild package, Visual Studio 2019 automatically looks for and finds tsconfig.json files ... however, what I wanted (and is normal) is to have separate production and development configurations, so that production does not include comments or map files.
After trying csproj conditions (which didn't work and gave build errors) and extending files in separate folders (which also didn't work), the only solution I have found so far was the following setup, which I am not against, albeit not ideal:
tsconfig.base.json - this contains my base options: include/exclude directories, module settings, etc.
tsconfig.debug.bak - this extends the base and contains options specific to debug (see below)
tsconfig.release.bak - like the debug file, but with release options
tsconfig.base.json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ES2020",
    "moduleResolution": "Classic",
    "lib": [ "DOM", "ES2020" ],
    "noImplicitAny": true,
    "noEmitOnError": true,
    "alwaysStrict": true,
    "outDir": "wwwroot/js",
    "allowUmdGlobalAccess": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": [ "scripts/**/*" ],
  "exclude": [ "wwwroot/lib/**/*", "wwwroot/js/**/*" ]
}
tsconfig.debug.bak
{
  "extends": "./tsconfig.base.json",
  "compilerOptions": {
    "removeComments": false,
    "sourceMap": true
  }
}
tsconfig.release.bak
{
  "extends": "./tsconfig.base.json",
  "compilerOptions": {
    "removeComments": true,
    "sourceMap": false
  }
}
The last piece of this little puzzle is a Pre-Build Event which replaces tsconfig.json with the debug or release version, based on the current configuration:
del "tsconfig.json"
copy "tsconfig.$(ConfigurationName).bak" "tsconfig.json"
All the above now allows you to have modern TypeScript using imports in an ASP.NET Core project.
The only caveat (and again, it is odd that there is no option for this) is that your import statements need .js appended to the module name; this works both when the TypeScript compiles and in the browser. You also need to include the main script as a module.
Here are some examples.
app.ts
export class App {
    constructor() {
    }

    public startup() {
        // Initialise and start our application
    }
}
site.ts
import { App } from './app.js';

$(document).ready(() => {
    const app = new App();
    app.startup();
});
_Layout.cshtml
<script src="~/js/site.js" asp-append-version="true" type="module"></script>
I hope this helps someone who might be discovering the same points as me.
Now available as a free Visual Studio Extension
Forcing Semantic Release Compatible Commits
We use the great semantic-release system for our Node projects at work; it is great for automating version numbers and change logs.
The way it works is that you format your commit messages in a particular way to mark changes as a 'fix', 'feature' or 'breaking change'; when semantic-release is run, it determines which version numbers need increasing and also generates a change log.
However, it's easy to format the message incorrectly, or forget completely, which will cause all sorts of issues when you try to do a release.
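For reference, here is a quick way to check a message against the same kind of pattern the hook below builds from its configured types (the sample commit subjects are made up for the demo):

```shell
# The pattern the hook assembles for types feat/fix/perf,
# with a first line of 4 to 50 characters after the type prefix
regexp='^(feat|fix|perf)(\(.+\))?: .{4,50}$'

check() {
    # grep -E succeeds only when the subject line matches the pattern
    if echo "$1" | grep -Eq "$regexp"; then echo "valid"; else echo "invalid"; fi
}

check "feat(search): add actor name lookup"   # valid
check "Fixed a bug somewhere"                 # invalid - wrong type prefix
check "fix: x"                                # invalid - subject too short
```

This is handy for testing tweaks to the hook's type list or length limits before wiring it into git.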
So I created a simple git hook to check my message format:
#!/bin/bash

# Config options
min_length=4
max_length=50
types=("feat" "fix" "perf")
# End config options

regexpstart="^("
regexp="${regexpstart}"

for type in "${types[@]}"
do
  if [ "$regexp" != "$regexpstart" ]; then
    regexp="${regexp}|"
  fi
  regexp="${regexp}$type"
done

regexp="${regexp})(\(.+\))?: "
regexp="${regexp}.{$min_length,$max_length}$"

function print_error() {
  echo -e "\n\e[1m\e[31m[INVALID COMMIT MESSAGE]"
  echo -e "------------------------\033[0m\e[0m"
  echo -e "\e[1mValid types:\e[0m \e[34m${types[@]}\033[0m"
  echo -e "\e[1mMax length (first line):\e[0m \e[34m$max_length\033[0m"
  echo -e "\e[1mMin length (first line):\e[0m \e[34m$min_length\033[0m\n"
}

# get the first line of the commit message
INPUT_FILE=$1
START_LINE=`head -n1 $INPUT_FILE`

if [[ ! $START_LINE =~ $regexp ]]; then
  # commit message is invalid according to semantic-release conventions
  print_error
  exit 1
fi
You can save the above script as commit-msg in the .git/hooks/ folder of any existing repo; or you can run the following commands to set things up for every new repo you init in the future:
git config --global init.templatedir '~/.git-templates'
mkdir -p ~/.git-templates/hooks
cp commit-msg ~/.git-templates/hooks
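To confirm the template wiring works without touching your real setup, here is a sketch using a throwaway HOME; the hook body here is a stand-in for the commit-msg script above:

```shell
# Use a throwaway HOME so the --global config does not touch your real one
export HOME="$(mktemp -d)"
mkdir -p "$HOME/.git-templates/hooks"

# Stand-in hook body; in practice this is the commit-msg script above
printf '#!/bin/sh\nexit 0\n' > "$HOME/.git-templates/hooks/commit-msg"
chmod +x "$HOME/.git-templates/hooks/commit-msg"

git config --global init.templatedir "$HOME/.git-templates"

# Any repo created from now on gets the hook copied in
cd "$(mktemp -d)"
git init -q demo
test -f demo/.git/hooks/commit-msg && echo "hook installed"
```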
Using nvmrc on Windows
Unfortunately, nvm use on Windows does not change the Node version to the one specified in the `.nvmrc` file, as that is not supported by nvm for Windows.
The easiest solution is a simple PowerShell command that approximates it: switch to the version specified and, if it isn't already installed, download and install it first:
nvm use $(Get-Content .nvmrc).replace( 'v', '' );
However, that's a bit awkward, and we can do a bit more; so instead we can create an 'alias' to a function that calls the command:
function callnvm() {
  $versionDesired = $(Get-Content .nvmrc).replace( 'v', '' );
  $response = nvm use $versionDesired;
  if ($response -match 'is not installed') {
    if ($response -match '64-bit') {
      nvm install $versionDesired x64
    } else {
      nvm install $versionDesired x86
    }
    nvm use $versionDesired;
  }
}
Set-Alias nvmu -value "callnvm"
Now we only need to type nvmu in a project folder for it to work properly.
However, this will only work for the current session; to make it useful for any project and every session, we can add this content to the PowerShell profile for the current user.
You can get the location of this file by typing $profile in a PowerShell session; then either edit or create the profile file and place the content shown above into it.
Converting Azure Pipeline with Task Groups to Yaml
We have a fairly complex pipeline which builds, tests and deploys our ASP.NET MVC app to an Azure Web App in an App Service Environment. Because we have several high-profile customers, we actually deploy the app to a separate web app for each customer 'instance', so they have database and application isolation.
Because each customer instance is identical except for some app settings pointing to a separate database, deployment is the same except for the web app location. Currently we have a Task Group with parameters setting the name of the instance (for the task display name), the app location, and the staging URL so we can run tests.
I would prefer to use the new YAML pipelines for this app, so it's easier to add new customer 'instances' in the future and we can source control the pipeline.
After some investigation, I discovered I can pass parameter 'objects' into a template YAML file to do pretty much what I want; the only tricky bit was having multiple properties per instance parameter 'object' and using the new template {{ each }} expression.
Below is how I constructed my yaml files for this solution.
azure_pipelines.yml
pool:
  name: Hosted VS2017
  demands:
  - npm
  - msbuild
  - visualstudio
  - vstest

steps:
- template: azure_webapp_template.yml
  parameters:
    webapps:
    - name: Customer 1
      url: customer1.azurewebsites.net
    - name: Customer 2
      url: customer2.azurewebsites.net
    - name: Customer 3
      url: customer3.azurewebsites.net
    - name: Customer 4
      url: customer4.azurewebsites.net
As you can see above, we create a webapps object with some nested properties for each 'webapp'.
Then, in our template, we iterate over each of the objects in the webapps parameter and expand its properties in the iterated tasks.
azure_webapp_template.yml
# Proving ability to loop over params a number of times
parameters:
- name: 'webapps'
  type: object
  default: {}

steps:
- ${{ each webapp in parameters.webapps }}:
  - task: PowerShell@2
    displayName: 'Task Group Test 1 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'
  - task: PowerShell@2
    displayName: 'Task Group Test 2 ${{webapp.name}}'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Name: ${{webapp.name}} with url ${{webapp.url}}"
      failOnStderr: true
      workingDirectory: '$(Build.SourcesDirectory)'
I hope this is of some use to others.
Azure Front-End Timeout
I experienced a particularly frustrating issue over the weekend with our Azure Web App; it took quite a while to find the cause.
Basically, one of the functions of our web app generates Word and Excel documents with images and text; these can be pretty large and, with images, can take anything from 1 minute to 10 minutes to create.
What we suddenly started experiencing with a new customer was that, at 4 minutes, the Ajax call would complete with an error which contained a small HTML fragment:
<html><head><title>500 - The request timed out.</title></head>
<body>
<font color="#aa0000"><h2>500 - The request timed out.</h2></font>
The web server failed to respond within the specified time.
</body></html>
This was odd, as the server was still busy downloading images and generating the documents in the background ... but how could that be, when the request had timed out and the response had been closed?
None of this happened when running locally; so after a lot of investigation, we concluded it wasn't our code and was an Azure issue.
What we ended up doing was responding with a 'heartbeat' (in our case, a carriage return) and flushing the buffer, then changing our jQuery Ajax call to move our client finalisation code from the 'success' to the 'complete' event.
This worked.
Azure support have just got back to me and said that this is 'by design': the PaaS front ends will kill the request with a 'timeout' error after 240 seconds. This timeout period cannot be adjusted, and the solution is to re-architect the code.
I will now look into an improved architecture, possibly using WebJobs and polling to take this generation out of the request pipeline, which is a far better solution <inherited code disclaimer>.
Getting Office365 Planner Now
I was excited about Office 365 Planner, but was dismayed that it will only be released in preview to select customers in Q4 of this year.
Urgh, then why announce it!! I will have long forgotten about it by then.
Anyway, I was taking a look at my apps and found the Planner icon on my Apps list, but only when in the Sway app. So you can use it now! Unless they take it away again soon.
Adding changes to the previous commit in Git
I end up needing to do this all the time; thankfully it's very easy: just stage the extra changes as normal and amend the commit.
Stage your missed changes:
git add .
Then just --amend the commit:
git commit --amend -m "Add new commit message which overwrites the previous one"
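As a side note, if you want to fold in the staged changes but keep the existing message, --amend --no-edit does that. Here is a quick sketch in a throwaway repo (file names and messages are invented for the demo):

```shell
# Throwaway repo: fold a missed change into the previous commit
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo

echo "first" > file.txt
git add . && git commit -qm "Add feature"

# Oops, missed a change - stage it and amend, keeping the original message
echo "missed bit" >> file.txt
git add . && git commit -q --amend --no-edit

git rev-list --count HEAD   # prints 1 - still a single commit
git log -1 --pretty=%s      # prints "Add feature"
```

Remember that amending rewrites the commit, so avoid it on commits you have already pushed and shared.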
Is your hard disk activity continually at 100% in #Windows10? Disable the Superfetch service, Win10 tips and pre-fetching in Chrome.
— Craig Buckler (@craigbuckler) January 20, 2016
How to get non-high dpi aware apps to be more readable
Make one Branch the New Master
This top-voted Stack Overflow answer worked brilliantly; the only thing I would suggest is to use the last bit so you can give it a comment. Kudos to Jefromi.
Forcing HTTPS on ASP.NET
First, a bit of background: I am in the middle of converting our main SaaS commercial web site from old Classic ASP with loads of COM objects to ASP.NET Web API running on an Azure Web App.
One of the things I have had to ensure is that the site auto-redirects to HTTPS. This is straightforward enough in an out-of-the-box MVC web site, as all you need to do is add the [RequireHttps] attribute to your home controller; as our web site is a SPA, there are no other MVC routes to decorate.
public class HomeController : Controller
{
    public HomeController()
    {
    }

    [RequireHttps]
    public ActionResult Index()
    {
        ViewBag.Title = "My Site";
        return View();
    }
}
Now, there are several solutions to handle Web API routes, which fall into two categories:
Forbid all non-HTTPS requests
Decide on a route-by-route basis which are permitted to be non-HTTPS
The first could be solved in several ways, but the second is much more flexible and closer to what I wanted to achieve. It's easy enough: create a new attribute which I can use to decorate the appropriate routes as needed.
Firstly create a new class in your app:
public class RequireHttpsAttribute : AuthorizationFilterAttribute
{
    public override void OnAuthorization( HttpActionContext actionContext )
    {
        if( actionContext.Request.RequestUri.Scheme != Uri.UriSchemeHttps )
        {
            actionContext.Response = new HttpResponseMessage( System.Net.HttpStatusCode.Forbidden )
            {
                ReasonPhrase = "HTTPS Required"
            };
        }
        else
        {
            base.OnAuthorization( actionContext );
        }
    }
}
Now you can decorate your WebApi routes as necessary:
public class TestController : ApiController
{
    public TestController()
    {
    }

    [RequireHttps]
    [HttpGet]
    [Route( "api/test/{id}/" )]
    public HttpResponseMessage TestHttps( int id )
    {
        return Request.CreateResponse( HttpStatusCode.OK, "Hurrah" );
    }
}
Using Gulp with VS2015 Task Runner
There are several steps to getting Gulp running in Visual Studio 2015 with the new Task Runner.
Unfortunately, I noticed that JavaScript errors (and possibly other issues) do not show up in the Output window; you just get a "no tasks" message in the Task Runner ... so if your tasks do not show, try divide-and-conquer until you find the broken code.
Anyway, the first thing to do is ensure that your local copy of Node is used ahead of the one supplied with Visual Studio, as that is likely to be out of date. This isn't strictly necessary, but if you get odd issues, it's worth a go.
Go to Tools->Options->Project and Solutions->External Web Tools and move your "$(PATH)" entry above the "$(DevEnvDir)\Extensions\Microsoft\Web Tools\External".
Now to get things installed and running we need to first install gulp globally from your command line:
npm install --global gulp
Every gulp module you want to use in your gulpfile needs to be installed into your project with a non-global install. Here are some common gulp modules:
npm install path --save-dev
npm install del --save-dev
npm install gulp --save-dev
npm install gulp-plumber --save-dev
npm install gulp-util --save-dev
npm install gulp-concat --save-dev
npm install gulp-less --save-dev
npm install gulp-imagemin --save-dev
npm install imagemin-pngquant --save-dev
npm install imagemin-pngcrush --save-dev
npm install imagemin-jpegoptim --save-dev
npm install gulp-eslint --save-dev
npm install gulp-tslint --save-dev
npm install gulp-minify-css --save-dev
npm install gulp-sourcemaps --save-dev
npm install gulp-uglify --save-dev
npm install gulp-watch --save-dev
Now that you have gulp and the modules installed, you can go ahead and create a simple 'gulpfile.js' in the root of your VS project:
var path = require('path'),
    del = require('del'),
    gulp = require('gulp'),
    plumber = require('gulp-plumber'),
    concat = require('gulp-concat'),
    imagemin = require('gulp-imagemin'),
    pngquant = require('imagemin-pngquant'),
    pngcrush = require('imagemin-pngcrush'),
    jpegoptim = require('imagemin-jpegoptim'),
    eslint = require('gulp-eslint'),
    tslint = require('gulp-tslint'),
    less = require('gulp-less'),
    minifyCSS = require('gulp-minify-css'),
    sourcemaps = require('gulp-sourcemaps'),
    uglify = require('gulp-uglify'),
    watch = require('gulp-watch');

gulp.task('concat', function () {
    return gulp.src(['scripts/app/**/*.js'])
        .pipe(plumber())
        .pipe(concat('./wwwroot/all.min.js'))
        .pipe(uglify())
        .pipe(gulp.dest('.'));
});

gulp.task('less', function () {
    return gulp.src('content/*.less')
        .pipe(plumber())
        .pipe(less({ paths: [path.join(__dirname, 'less', 'includes')] }))
        .pipe(sourcemaps.write('./maps'))
        .pipe(minifyCSS())
        .pipe(gulp.dest('./wwwroot/css'));
});

gulp.task('imagemin-png', function () {
    return gulp.src('i/*.png')
        .pipe(plumber())
        .pipe(imagemin({
            progressive: true,
            svgoPlugins: [{ removeViewBox: false }],
            use: [pngquant(), pngcrush({ reduce: true })]
        }))
        .pipe(gulp.dest('./wwwroot/i'));
});

gulp.task('imagemin-jpeg', function () {
    return gulp.src('i/*.jp*g')
        .pipe(plumber())
        .pipe(imagemin({
            progressive: true,
            svgoPlugins: [{ removeViewBox: false }],
            use: [jpegoptim({ progressive: true })]
        }))
        .pipe(gulp.dest('./wwwroot/i'));
});

gulp.task('tslint', function () {
    return gulp.src('scripts/app/main.ts')
        .pipe(plumber())
        .pipe(tslint())
        .pipe(tslint.report('verbose', { emitError: false }));
});

gulp.task('eslint', function () {
    // http://eslint.org/
    var lintOptions = {
        "globals": {
            "jQuery": false,
            "$": true
        },
        "rules": {
            "camelcase": 1,
            "no-comma-dangle": 2,
            "quotes": 0
        }
    };

    return gulp.src(['scripts/app/test.js', '!scripts/app/*.min.js'])
        .pipe(plumber())
        // eslint() attaches the lint output to the eslint property
        // of the file object so it can be used by other modules.
        .pipe(eslint(lintOptions))
        // eslint.format() outputs the lint results to the console.
        // Alternatively use eslint.formatEach() (see Docs).
        //.pipe( eslint.format() )
        // To have the process exit with an error code (1) on
        // lint error, return the stream and pipe to failOnError last.
        .pipe(eslint.failOnError());
});

// Synchronously delete the output file(s)
gulp.task('clean', function () {
    del.sync(['./wwwroot/all.*js'])
});

gulp.task('default', ['clean', 'eslint', 'tslint', 'less', 'imagemin-jpeg', 'imagemin-png', 'concat'], function () {
    // This will only run if the lint task is successful...
});

// These are 'watch' tasks which will wait until a file has changed which matches the source
// path and run the array of tasks when required
gulp.task('watchjs', function () {
    return gulp.watch(['scripts/app/*.ts'], ['tslint']);
});

gulp.task('watchts', function () {
    return gulp.watch(['scripts/app/*.js', '!scripts/app/*.min.js'], ['eslint']);
});

gulp.task('watchless', function () {
    return gulp.watch(['content/*.less'], ['less']);
});
ASP.NET MVC Routing Issues
Here is an oddity when creating an ASP.NET MVC/Web API app: you can't have a route which contains certain words, like "con" or "nul".
This is annoying when trying to create a REST-based search API: for example, if you wanted to search for an actor name as someone typed, you would get a 404 error as soon as they typed "con" for someone like "Connie".
There is a solution: add the relaxedUrlToFileSystemMapping attribute to the httpRuntime element in your web.config 'system.web' section:
<system.web>
  <httpRuntime targetFramework="4.5" relaxedUrlToFileSystemMapping="true"/>
</system.web>
References: http://bitquabit.com/post/zombie-operating-systems-and-aspnet-mvc/
Separating a Subfolder into its own Git Repo
The Easy Way
It turns out that this is such a common and useful practice that the overlords of git made it really easy, but you have to have a newer version of git (>= 1.7.11 May 2012).
Prepare the old repo
pushd <big-repo>
git subtree split -P <name-of-folder> -b <name-of-new-branch>
popd
Note: <name-of-folder> must NOT contain leading or trailing path separators. For instance, the folder named subproject MUST be passed as subproject, NOT ./subproject/.
Note for Windows users: when your folder depth is > 1, <name-of-folder> must use *nix-style folder separators (/). For instance, the folder named path4\path5\subproject MUST be passed as path4/path5/subproject.
Create the new repo
mkdir <new-repo>
pushd <new-repo>
git init
git pull </path/to/big-repo> <name-of-new-branch>
Link the new repo to Github or wherever
git remote add origin <git@github.com:my-user/new-repo.git>
git push origin -u master
Cleanup, if desired
popd # get out of <new-repo>
pushd <big-repo>
git rm -rf <name-of-folder>
Note: this leaves all the historical references in the repository. See the sections below if you're actually concerned about having committed a password, or if you need to decrease the file size of your .git folder.
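The whole procedure can be exercised end-to-end in throwaway repos; the folder and repo names below are invented for the demo:

```shell
# Build a 'big' repo with a subfolder, split it out, and pull it into a new repo
work="$(mktemp -d)" && cd "$work"

mkdir big-repo && cd big-repo
git init -q . && git config user.email demo@example.com && git config user.name demo
mkdir subproject && echo "lib code" > subproject/readme.md
echo "app code" > main.txt
git add . && git commit -qm "initial"

# Split the folder's history onto its own branch
git subtree split -P subproject -b split-branch

# Pull that branch into a brand new repo
cd "$work" && mkdir new-repo && cd new-repo
git init -q . && git config user.email demo@example.com && git config user.name demo
git pull -q "$work/big-repo" split-branch

ls   # readme.md now sits at the repo root; main.txt never came across
```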
Clearing your history
By default, removing files from Git doesn't actually remove them; it just commits that they aren't there any more. If you want to actually remove the historical references (i.e. you have committed a password), you need to do this:
git filter-branch --tree-filter 'rm -rf <name-of-folder>' HEAD
After that, you can check that your file or folder no longer shows up in the Git history at all:
git log -S<name-of-folder> # should show nothing
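Here is a throwaway-repo sketch of the rewrite; the 'secrets' folder and its contents are made up for the demo, and the environment variable just suppresses the warning pause newer Git versions add to filter-branch:

```shell
# Commit a folder, rewrite history to drop it, then confirm it is gone
export FILTER_BRANCH_SQUELCH_WARNING=1
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo

mkdir secrets && echo "hunter2" > secrets/password.txt
echo "fine" > app.txt
git add . && git commit -qm "initial"

git filter-branch --tree-filter 'rm -rf secrets' HEAD

# No commit on the rewritten branch touches the folder any more
hits=$(git log --raw HEAD | grep -c "secrets" || true)
echo "history hits: $hits"
```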
However, you can't "push" deletes to github and the like. If you try you'll get an error and you'll have to git pull before you can git push - and then you're back to having everything in your history.
So if you want to delete history from the "origin" - meaning delete it from GitHub, Bitbucket, etc. - you'll need to delete the repo and re-push a pruned copy. But wait - there's more! If you're really concerned about getting rid of a password or something like that, you'll need to prune the backup (see below).
Making .git smaller
The aforementioned delete-history command still leaves behind a bunch of backup files, because Git is all too kind in helping you not ruin your repo by accident. It will eventually delete orphaned files over the days and months, but it leaves them there for a while in case you realise you accidentally deleted something you didn't want to.
So if you really want to empty the trash and reduce the clone size of a repo immediately, you have to do all of this really weird stuff:
rm -rf .git/refs/original/ && \
git reflog expire --all && \
git gc --aggressive --prune=now

git reflog expire --all --expire-unreachable=0
git repack -A -d
git prune
That said, I'd recommend not performing these steps unless you know that you need to - just in case you did prune the wrong subdirectory, y'know? The backup files shouldn't get cloned when you push the repo, they'll just be in your local copy.
Credit
http://psionides.eu/2010/02/04/sharing-code-between-projects-with-git-subtree/
http://stackoverflow.com/questions/1216733/remove-a-directory-permanently-from-git
http://blogs.atlassian.com/2013/05/alternatives-to-git-submodule-git-subtree/
http://stackoverflow.com/questions/1904860/how-to-remove-unreferenced-blobs-from-my-git-repo
http://stackoverflow.com/questions/359424/detach-subdirectory-into-separate-git-repository/17864475#17864475
Merging Two Git Commits
Quite often I make a commit to Git and then realise I missed a change and have to make another commit; they both solve the same issue, so I just want to merge them to tidy things up.
Here is an easy way, assuming you want to merge (or squash, in Git parlance) the last two commits:
# Reset the current branch to the commit just before the last 2:
git reset --hard HEAD~2

# HEAD@{1} is where the branch was just before the previous command.
# This command sets the state of the index to be as it would just
# after a merge from that commit:
git merge --squash HEAD@{1}

# Commit those squashed changes. The commit message will be helpfully
# prepopulated with the commit messages of all the squashed commits:
git commit
I found this in one of the answers (not the accepted answer) on Stack Overflow.
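The recipe can be run through in a throwaway repo; the commit messages below are invented for the demo:

```shell
# Base commit plus two fix-ups, then squash the last two into one
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo

echo "base" > f.txt && git add . && git commit -qm "base"
echo "fix part 1" >> f.txt && git commit -aqm "fix: the thing"
echo "fix part 2" >> f.txt && git commit -aqm "fix: the bit I missed"

# HEAD@{1} in the reflog is where the branch was before the reset,
# so the squash merge brings both fix-up changes back as one set
git reset -q --hard HEAD~2
git merge --squash HEAD@{1}
git commit -qm "fix: the thing, including the missed bit"

git rev-list --count HEAD   # prints 2 - the base plus the squashed commit
```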
SQL Server Failed to Start on Reboot
I got an error the other night which frightened the life out of me after I restarted my SQL Server 2014 machine because of a Windows Update.
Basically, it would not restart, and the Event Viewer gave some obscure error message which didn't help. After finding and looking in the SQL Server log, I saw the following messages:
2015-02-28 00:39:11.63 spid8s Starting up database 'master'.
2015-02-28 00:39:11.84 spid8s Attempting to allocate 6560086 BUF for buffer pool extension for a maximum of 6560085 page descriptors.
2015-02-28 00:39:11.84 spid8s Error: 864, Severity: 16, State: 1.
2015-02-28 00:39:11.84 spid8s Attempting to allocate 6560086 BUF for buffer pool extension for a maximum of 6560085 page descriptors.
After a bit of searching I came across this excellent article.
Basically, because I only had SQL Server Standard, I found out that Buffer Pool Extensions are limited to just 4 times the Maximum Server Memory option. I didn't know this when it let me set any value I wanted -- bad show, Microsoft!!
One of the first rules of application development is to not allow bad values; worse still, there are so many ways they could have improved this:
Not allow the invalid value in the first place
If an invalid value is entered, limit it to the maximum value allowed, or use a default value
Turn off the feature if the value is invalid
NEVER EVER just stop the product working without any useful error message as to why!!