Automatic Reconnection of iSCSI Targets in Windows 10 using PowerShell

When my highly recommended Synology Disk Station reboots for a required update (I’ve got it set to reboot automatically), a shared Windows 10 PC in our house can’t always reconnect to its iSCSI targets without manual intervention. Unfortunately, I haven’t always noticed, which has led to several features of Windows not functioning the way I want. (I have mapped the iSCSI drives/disks in Windows and shared them as network shares for the other PCs/laptops in our house; this way I can use Windows BitLocker encryption on the iSCSI drive contents.)

To make the reconnection more automatic, I created a simple one-line PowerShell script that attempts to connect any disconnected iSCSI targets, and I run it periodically using the Windows Task Scheduler.

I saved this into a script file called reconnect-iscsi-targets.ps1:

Get-IscsiTarget | where { $_.IsConnected -eq $false } | Connect-IscsiTarget

Then, in Task Scheduler, I created a new task that runs daily and repeats every 10 minutes. The script just gets all iSCSI targets, filters down to those that aren’t connected, and passes the results to the Connect-IscsiTarget cmdlet.

For the action, I selected “Start a program”; for Program/script I entered “powershell.exe”, and for the arguments I added “-File” followed by the full path to the script file, like:

-File c:\Users\aaron\Documents\reconnect-iscsi-targets.ps1

If there are spaces in the path to the PowerShell file, be sure to add quotes around the full path and file name.

You shouldn’t need the “Start in” option set (leave it empty if you’d like).

On the General tab of the task, make sure you’ve set the “Run whether user is logged on or not” option and “Run with highest privileges.”

Next up — how to quickly create a Self-Signed Code-Signing certificate. And, how to actually allow scripts to run!

Frustrated by the DocumentDB Emulator

I was very excited about the announcement of a DocumentDB emulator!

I could finally explore the magic of this new document-based database (I can’t say “NoSQL,” as it supports a SQL dialect) without spending money just to experiment (it was “pay to play” only).

However, it’s not to be for me:

  1. Worst: it accepts requests from LOCALHOST only. Its ports are bound to 127.0.0.1 rather than 0.0.0.0, which prevents it from being used from other machines. I wanted to install it on an always-on PC in our house rather than on my home workstation. I can sort of see why they want to limit it, but come on, this is for development purposes. It’s not set up for availability, performance, etc. One simple right-click of the tray icon resets and clears all the data. There are a number of effective ways to make it developer-only, and I wouldn’t have picked this one.
  2. It constantly uses CPU, even when it isn’t being used. On my laptop, for example, it hovered around 10-15% CPU even with zero active connections. The problem isn’t consistent, though: I installed it on a secondary workstation, and there it uses a lot of RAM (nearly 400 MB) but only 1% CPU.
  3. It’s Windows only. This is a minor point if you’re using Windows, but if you want to do development on a Mac, you won’t be able to use it, even with the emulator hosted in a VM (thanks to the localhost-only binding in #1).

So, for the best experience with DocumentDB, you’ll likely need to stick with the pay-to-play option of hosting it in Azure. I’m disappointed. I’ll look for a different database … something that doesn’t have this limitation.

Bummer.

Tree walking and display console app in Go

I hadn’t done anything at all interesting in Go. And some might say, I still haven’t. However, I wanted to do something that I’d find occasionally useful.

I wanted a tiny console app that would display a tree of the directory and file structure. On Linux, this is already available, but the Windows equivalent is lacking.

So, that’s what I built.

It’s one small file with one external library dependency.

I did notice that Go’s directory-scanning functions like ReadDir suffer from a common issue when they use the traditional Win32 APIs: they do not adequately handle file paths longer than about 250 characters (the legacy MAX_PATH limit is 260). When you have NodeJS source code on your system with lots of deeply nested paths, many Windows programs fail miserably when doing file/folder management operations (Windows Explorer, I’m looking at YOU). Using the one “easy” trick of prepending the path with \\?\, you can use APIs like ReadDir reasonably reliably. Whereas earlier versions of my tree app would crash, it can now handle deeply nested directory structures.
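
To illustrate the trick, here’s a minimal sketch (not code from the actual app; the longPath helper name and the sample path are just for illustration): prepend \\?\ to an absolute path before handing it to ReadDir.

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "path/filepath"
    "runtime"
    "strings"
)

// longPath prepends the \\?\ extended-length prefix to an absolute Windows
// path so Win32-backed calls like ReadDir can cope with paths beyond the
// legacy MAX_PATH limit. On non-Windows systems the path is returned unchanged.
func longPath(path string) string {
    if runtime.GOOS != "windows" || strings.HasPrefix(path, `\\?\`) {
        return path
    }
    abs, err := filepath.Abs(path)
    if err != nil {
        return path
    }
    return `\\?\` + abs
}

func main() {
    // Sample path for illustration only.
    entries, err := ioutil.ReadDir(longPath(`C:\projects\some-node-app\node_modules`))
    if err != nil {
        log.Fatal(err)
    }
    for _, entry := range entries {
        fmt.Println(entry.Name())
    }
}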


The one external dependency is the fatih/color package:

go get github.com/fatih/color
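
And for the curious, here’s a rough, self-contained sketch of the general shape of such a tool (again, not my actual source, just the idea): recursively read each directory, print entries with indentation, and color directory names using fatih/color.

package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "os"
    "path/filepath"
    "strings"

    "github.com/fatih/color"
)

// printTree lists the entries under dir, indenting by depth and recursing
// into subdirectories. Directory names are printed in cyan.
func printTree(dir string, depth int) {
    entries, err := ioutil.ReadDir(dir)
    if err != nil {
        log.Printf("skipping %s: %v", dir, err)
        return
    }
    indent := strings.Repeat("  ", depth)
    for _, entry := range entries {
        if entry.IsDir() {
            color.Cyan("%s%s", indent, entry.Name())
            printTree(filepath.Join(dir, entry.Name()), depth+1)
        } else {
            fmt.Printf("%s%s\n", indent, entry.Name())
        }
    }
}

func main() {
    // Default to the current directory; accept an optional root argument.
    root := "."
    if len(os.Args) > 1 {
        root = os.Args[1]
    }
    printTree(root, 0)
}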

Customize the External VS Code Console

I have Visual Studio 2015 installed and when I launch a command prompt on Windows, it’s generally the Visual Studio 2015 Developer Command prompt (in fact, I have it pinned to my Windows 10 task bar).


On Windows, using Visual Studio Code 1.7+, pressing SHIFT+CTRL+C opens a Windows Command prompt from the root directory of your opened folder. It’s super useful.

I wanted the Developer Command prompt to open instead of a standard command prompt.

To change that, open your User Settings (settings.json) and add the following:

"terminal.external.windowsExec": "C:\\Program Files (x86)\\Microsoft Visual Studio 14.0\\Common7\\Tools\\VsDevCmd.bat"

Of course, you can use other batch or command files to customize the prompt. I just wanted all of the standard developer tools I have installed to be available.

MobX and TypeScript Experiment

I wanted to give MobX a try, in particular from TypeScript.

Here’s my first attempt. I liberally used the example found in the createTransformer documentation as my guide.


import "core-js";
import { observable, autorun, createTransformer } from "mobx";

/*
 The store that holds our domain: boxes and arrows
*/

class Store implements Storable {
    @observable public boxes: Box[];
    @observable public arrows: Arrow[];
    @observable public selection: any;

    constructor(init: Storable = {}) {
        this.boxes = init.boxes || [];
        this.arrows = init.arrows || [];
        this.selection = init.selection;
    }
}

interface Storable {
    boxes?: any[];
    arrows?: Arrow[];
    selection?: any;
}

interface Box {
    id: string;
    caption?: string;
}

interface Arrow {
    id: string;
    to?: Box;
    from?: Box;
}

const serializeState = createTransformer<Store, Store>(store => {
    return new Store({
        boxes: store.boxes.map(serializeBox),
        arrows: store.arrows.map(serializeArrow),
        selection: store.selection ? store.selection.id : null
    });
});

// copy using Object.assign (as this is just a simple JS object anyway)
const serializeBox = createTransformer<Box, Box>(box => Object.assign({}, box));

const serializeArrow = createTransformer<Arrow, Arrow>(arrow => {
    // or can copy manually...
    console.log("serializeArrow");  // this is only called 3 times!
    return {
        id: arrow.id,
        to: arrow.to,
        from: arrow.from
    };
});

const store = new Store();
const states: Storable[] = [];

autorun(() => {
    // this could be used to create an undo buffer, or whatever
    // probably wouldn't want infinite growth ... 🙂
    states.push(serializeState(store));
});

const b1 = { id: "b1", caption: "Box 1" };
const b2 = { id: "b2", caption: "Box 2" };
const b3 = { id: "b3", caption: "Box 3" };
store.boxes.push(b1);
store.boxes.push(b2);
store.boxes.push(b3);

store.arrows.push({ id: "a1", from: b1, to: b2 });
store.arrows.push({ id: "a2", from: b1, to: b3 });
store.arrows.push({ id: "a3", from: b2, to: b3 });
b1.caption = "Box 1 - Edited";

// Should be 8
console.log(states.length);

b1.caption = "Box 1 - Final";

// Should be 9
console.log(states.length);

To make that work, install the dependencies (and note that the @observable decorators also require "experimentalDecorators": true in tsconfig.json):


npm install -S mobx core-js reflect-metadata
npm install -D @types/core-js

Sweet that MobX includes a TypeScript declarations file. 🙂

The thing of interest here is that MobX assists in maintaining a stack of the object graph’s state, something that could be used, for example, in an undo buffer or a comprehensive log system.

In this example, that’s done using MobX’s autorun functionality. When any of the observables the autorun function depends on changes, the function executes again. In the example above, it makes a clone of the current store using functions created with createTransformer, which turns a function into a reactive, memoizing function. Through memoization, only the portions of the object graph that have changed are re-transformed, not everything, every time. That doesn’t mean you won’t want to limit the growth of the states, but you shouldn’t worry that a large, complex object structure is being rebuilt with every change.

As TypeScript doesn’t support the object spread operator (which is convenient for making a clone of an object), I’ve used Object.assign instead (which may require a polyfill depending on the environment in which you use this code).