Prepping to get hands-on experience with high availability

I currently have a small VM host with a 1TB HDD, 16GB of RAM, and a 3.3GHz CPU.  Today I ordered a second (almost identical) computer.  Last night I set up FreeNAS, running from a 2GB USB flash drive, on an old Pentium 4 PC with a 320GB HDD.  Today I configured it as an iSCSI target, to serve as a CSV (cluster shared volume) for my high availability education.
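Once the target is up, connecting from the Windows side can be sketched with the built-in iSCSI initiator cmdlets (Windows 8 / Server 2012 and later).  The portal address below is just a placeholder:

```powershell
# Make sure the initiator service is running and starts with Windows
Start-Service MSiSCSI
Set-Service MSiSCSI -StartupType Automatic

# Point the initiator at the FreeNAS box (10.0.0.5 is a placeholder address)
New-IscsiTargetPortal -TargetPortalAddress 10.0.0.5

# List discovered targets, then connect persistently so it survives reboots
Get-IscsiTarget
Get-IscsiTarget | Connect-IscsiTarget -IsPersistent $true
```

After connecting, the LUN shows up as a raw disk in Disk Management, ready to be brought online and formatted.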

FreeNAS Quick Start Guide

FreeNAS iSCSI documentation

I'd like to see if Jumbo Frames help me out (with this $18 switch), but that will have to wait for now.  I'd also like to use a pure-SSD environment, which would let me saturate a 3Gb network link to my poor man's SAN.  The point of all this is to get MCSE 2012 certified.

How to enable Jumbo Frames in Server Core
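For my own reference, on Server Core this boils down to a couple of netsh commands (the adapter name is whatever `show subinterfaces` reports, and the NIC driver must also support jumbo frames):

```shell
netsh interface ipv4 show subinterfaces
netsh interface ipv4 set subinterface "Ethernet" mtu=9000 store=persistent

rem Verify end-to-end: 8972 = 9000 minus 28 bytes of IP/ICMP headers
ping -f -l 8972 192.168.1.10
```

If the ping with the don't-fragment flag fails, something in the path (like that $18 switch) isn't passing jumbo frames.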

I've learned that the GPT partitioning style supports drives over 2TB (MBR's limit) and more than 4 primary partitions (if you care about partition counts anymore).
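As a quick refresher, converting a blank disk to GPT from the command line looks like this (disk 1 is a placeholder, and `clean` wipes the disk):

```shell
diskpart

rem inside the diskpart prompt:
list disk
select disk 1
clean
convert gpt
```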


Monitoring (fragment)

I've been assigned to stabilize and document the IT infrastructure of a company with 35 workstations and several servers.  A 100Mbps wireless bridge connects two buildings.  Users in building 2 complain of slow network performance.  A monitoring system is needed.

My boss likes Zabbix; I tried it, but got lost in textual config files and endless customization options.

Today I installed PRTG Network Monitor and like it a lot.  It gives you 10 sensors for free; 100 sensors cost a one-time $440 fee.  The only problem is that the free version's fastest polling interval is 60 seconds (the paid version's is 30 seconds).  I'm looking for a 3-5 second interval - from a free tool.

Solarwinds Bandwidth Monitor doesn't offer logging.
Flowalyzer offers logging, but it doesn't work.
STP works, but its internal math seemed wrong - due to my own error at the time.
...which brought me to the (yep, text-only) MRTG.

I needed to identify the correct OID to monitor: Flowalyzer provided this, and I found a list of Proxim OIDs at this link.
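To give MRTG something to graph, each target line pairs an in-OID with an out-OID.  A minimal sketch of an mrtg.cfg entry using the standard ifInOctets/ifOutOctets counters for interface index 1 - the community string and host are placeholders, and a Proxim-specific OID would slot into the same position:

```
Target[bridge]: 1.3.6.1.2.1.2.2.1.10.1&1.3.6.1.2.1.2.2.1.16.1:public@10.0.0.2
MaxBytes[bridge]: 12500000
Title[bridge]: Building 2 wireless bridge
Options[bridge]: growright, bits
```

MaxBytes is in bytes per second, so 12500000 corresponds to the 100Mbps bridge.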

Misc Notes:
- To start the WinPcap driver, run "net start npf"
- http://support.microsoft.com/kb/314053 - TCP window size; bigger is better on slow networks
- http://www.mikrotik.com/thedude - haven't tried it out yet.
- https://www.untangle.com/store/web-filter.html - I'd like to try it out.


File classification + dynamic access control

This evening's study topic was file classification + dynamic access control.  The files that I want to classify are about dating, finances, travel plans, and correspondence.

I installed FSRM (File Server Resource Manager) to let me configure classification properties and rules for files.  I tried to make a single rule that would flag all files containing the names of several different women with the "dating" property, but you can't do that - you need a separate rule for each piece of text that you're searching on.  Bummer.
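The one-rule-per-string limitation can at least be scripted around by generating a rule per name with the FileServerResourceManager cmdlets (2012 R2).  A sketch - the property name, folder, and names here are all made up:

```powershell
Import-Module FileServerResourceManager

# One content-classification rule per search string (hypothetical names/path)
$names = 'Alice', 'Bella', 'Carol'
foreach ($n in $names) {
    New-FsrmClassificationRule -Name "Dating-$n" `
        -Property 'Dating' -PropertyValue 'Yes' `
        -Namespace @('D:\Docs') -ContentString @($n) `
        -ClassificationMechanism 'Content Classifier'
}

# Kick off classification now instead of waiting for the schedule
Start-FsrmClassification
```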

I think Microsoft wants me to classify entire folders with FSRM - just like I would've done with an Active Directory security group - and then control access via DAC instead of security groups.  OK, makes sense.

The classification is stored in a file's NTFS alternate data stream, so the classification will stay with the file as long as that file lives on an NTFS-formatted volume.  I can't find any GUI-based way to remove all file classifications, so I guess you'd need to copy your files to a FAT32 or ReFS volume in order to wipe out their classification attribute(s).
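Out of curiosity, the alternate data streams on a classified file can be listed from PowerShell 3.0+ (the path is a placeholder):

```powershell
# List every stream on the file; a classified file carries an extra
# stream alongside the default ':$DATA' stream
Get-Item 'D:\Docs\plans.docx' -Stream *

# A specific stream could then be removed by name:
# Remove-Item 'D:\Docs\plans.docx' -Stream '<stream name>'
```

That might be a less drastic cleanup route than round-tripping files through a FAT32 volume.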

For Dynamic Access Control, be aware that Windows 8 clients support device claims, while Windows 7 clients don't issue claims themselves - as I understand it, the file server queries the domain controller for a user's claims on behalf of a Windows 7 client, so user claims still work there.

TrueCrypt containers mounted as a drive letter do not support volume shadow copy.  However, an entire drive that is encrypted with TrueCrypt does support volume shadow copy.

After installing FSRM on a 2012 R2 server and enabling VSS (volume shadow copy) on a couple of VHD files that I'd created and mounted, I noticed that 600MB was immediately allocated by the OS (although not written, since the empty and thin-provisioned 2GB VHD file remained at 30MB on the host system).  Disabling VSS did not recover the disk space.  I'm thoroughly puzzled as to its cause.
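One guess (unverified): enabling VSS pre-allocates shadow-copy diff-area storage on the volume, and disabling VSS doesn't automatically delete shadow copies that already exist.  The allocation can be inspected and reclaimed with vssadmin (D: is a placeholder volume):

```shell
vssadmin list shadowstorage
vssadmin list shadows

rem Reclaim: delete existing shadow copies for the volume
vssadmin delete shadows /for=D: /all

rem Or cap the diff area so it can't grow past a limit
vssadmin resize shadowstorage /for=D: /on=D: /maxsize=320MB
```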

Here's a link to How NOT to go overboard with classification.  Here's a link to an overview that mentions a deployment tool from Microsoft.

Now, the classification properties that you define inside FSRM are local to that file server.  If you want consistency across multiple file servers, or you want to control access with these properties (just like NTFS or share permissions), then you need Dynamic Access Control, configured from the new Active Directory Administrative Center.  There, you define forest-wide resource properties for files/folders and/or "claims" that describe users by department, location, etc. - attributes pulled from their Active Directory account details.  You can then make a rule that requires a user claim to match a resource property, package that up in a "central access policy" that gets deployed in a GPO, and then manually apply that central access policy to the shares on your file server(s).
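The same flow can be sketched from PowerShell with the ActiveDirectory module - the rule below grants access when the user's Department claim matches the resource's Department property.  All the names are illustrative, and the conditional-ACE SDDL string is the fiddly part, so treat this as a sketch rather than a working recipe:

```powershell
Import-Module ActiveDirectory

# Source a user claim from the AD 'department' attribute
New-ADClaimType -DisplayName 'Department' -SourceAttribute 'department'

# Rule: full access for Authenticated Users when the user claim
# matches the resource property (conditional ACE in SDDL form)
New-ADCentralAccessRule -Name 'Department Match Rule' `
    -CurrentAcl 'D:(XA;;FA;;;AU;(@USER.ad://ext/Department == @RESOURCE.Department_MS))'

# Package it in a policy, which a GPO then pushes to the file servers
New-ADCentralAccessPolicy -Name 'Department Match Policy'
Add-ADCentralAccessPolicyMember -Identity 'Department Match Policy' `
    -Members 'Department Match Rule'
```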

Experimenting with DAC:
- Enabled the built-in "Department" resource property.  This immediately became visible in the FSRM as a classification that is "global" in scope.  You can't edit it in the FSRM, because it's managed by the ADAC.
- A "Central Access Policy" exists solely to let you deploy one or more "Central Access Rules" to your file servers via group policy, instead of manually defining those rules on each shared folder of each file server.  Now you can simply point & click to enable the rule(s) wherever they're needed.
- Central Access Policies become available for first application on a file server as soon as they are refreshed by group policy.
- Changes to Central Access Rules are effective immediately on all file shares that reference them. 

This is a cool way to manage file permissions!