/* BeejBlog */

Windows 8 – my initial perspectives

A decent developer-oriented introduction to the new Win8 app dev paradigm called “Metro”: “Metro style apps using XAML: what you need to know” (by Joe Stegman, Group Program Manager for the Windows XAML team at the Microsoft //BUILD/ Conference, Sep. 2011, Anaheim CA)

  • "Metro" is the name they're putting on building applications that embrace the touch/gesture/handheld era we're in now.
  • MS marketing is drawing a clear distinction between this and the "old school" "productivity" oriented desktop apps... one key pattern they're throwing around to distinguish the two is that these kinds of apps are highly CONTENT focused … raw content consumes the screen real-estate, traditional buttons (aka “chrome”) are hidden... gestures rule... the ubiquitous demo is the gesture-nav photo album... thumbnails are all you see to start... you flip around in the gallery with touch gestures... drill into images with another gesture, yada yada
  • XAML still seems to be an active technology (which is a relief since I spent the last two years self-educating there... I must say that if nothing else, databinding in XAML is definitely a strong evolution... I've already seen evidence of frameworks coming up after taking XAML as the current pinnacle from which to consider improvement)
  • HTML5 + JavaScript looks to be the new UI go-to - which is interesting and wasn't discussed, so I still have a mental gap about what that really means.
  • significant: the .Net framework is now "native" and it's morphed into "WinRT" (Windows Runtime)... i.e. these APIs are now bundled directly within the OS just like we've been familiar with Win32 all these years... this was a very logical step and I'm glad they made it... but this leads to the next point...
  • Metro/WinRT is Win8 only
    http://social.msdn.microsoft.com/Forums/en-US/windowsdeveloperpreviewgeneral/thread/d4850eb7-5fb2-45b6-9e89-cd13056c4797
    this will create some inevitable app fragmentation in the medium term... yet another choice for the harried mobile developer... such is progress... on a personal level I'm not really worried about it... I like seeing the upgrades... pretty much at all costs.
  • the demo showed that basic Silverlight/WPF XAML syntax is mostly cut/paste compatible in a "Metro" XAML app.
    The basic development approach of Metro is the same as SL/WPF.
    • But you can tell from the tone that there have been some breaking tweaks...
  • and of course there are outright new controls to get cozy in this new content/gesture focused paradigm.
    • I liked the part of the demo where he showed how trivial it was to slap on some new "flipview" type controls that handle all the momentum style scrolling where the scroll naturally speeds up, slows down and jiggles when it stops... we don't have to code that... we just drop the equivalent of a fancy listbox on the UI, bind it to a list of data and it does all that fancy footwork... that's pretty cool.
  • One biggish thing they mention is that *everything* is now async... I know SL was already pretty much that way anyway... in WPF you had a choice... the basic motivation is that it leaves the UI responsive... i.e. it's no longer possible to create an app where the window locks up and greys out with "(Not Responding)"... I agree that this is a good thing overall... I've been coding this way and there are some nice language facilities in C# that make it not that much different from sync programming from a code readability and maintenance standpoint (e.g. inline anonymous functions passed to an async context - see the little sketch below).
  • of course, all this goes back to whether you think you need to tie yourself to a Windows platform in the first place... there are still pure web-driven apps via HTML5, etc. representing a strong, viable option.
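Here's a minimal sketch of that inline-lambda async style on .Net 4's Task API (my own toy example, not from the talk) - the click handler returns immediately and the continuation marshals the result back onto the UI thread:

    using System.Collections.Generic;
    using System.Threading.Tasks;
    using System.Windows;
    using System.Windows.Controls;

    public class PhotoWindow : Window
    {
      ListBox photoList = new ListBox();

      // hypothetical blocking fetch; stands in for any slow I/O
      static List<string> FetchPhotos() { return new List<string> { "a.jpg", "b.jpg" }; }

      void LoadButton_Click(object sender, RoutedEventArgs e)
      {
        // the work runs off the UI thread; the inline lambda continuation
        // runs back on the UI thread, so no "(Not Responding)" lockup
        Task.Factory.StartNew(() => FetchPhotos())
          .ContinueWith(t => photoList.ItemsSource = t.Result,
            TaskScheduler.FromCurrentSynchronizationContext());
      }
    }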

  • from a consumer standpoint, it'll be cool to have a solid tablet oriented flavor of Windows knocking heads with the Androids and iPhones out there.
    • Windows 8 of course fires up with the Windows Phone 7 style "tile" based "home page".
    • And the familiar old Windows desktop/Start Bar is just a flip away if you need that.
    • they're saying Win8 will run on both ARM and Intel so the whole mobile hardware spectrum is fair game.
      • after a quick scan, I'm not the only one thinking about Win8 on a contemporary Samsung Galaxy tab.

Wading into MVVM

  1. Commands rather than Events
    • instead of event handlers, think in terms of firing commands that will find their way to the corresponding command properties declared on ViewModels
    • i.e. use the Command property of XAML widgets rather than the Click event handler
    • the MVVM-popularized RelayCommand class works well as a lightweight ICommand implementation to use on ViewModels (minimal sketch after this list)
    • or barring that, use the Expression Blend Interactivity DLL to map click events to ViewModel methods via the "CallMethodAction" behavior
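For reference, here's a minimal RelayCommand sketch along the lines of the one the MVVM articles popularized (my paraphrase from memory, not a canonical source):

    using System;
    using System.Windows.Input;

    // lightweight ICommand that simply relays Execute/CanExecute to delegates
    public class RelayCommand : ICommand
    {
      readonly Action<object> execute;
      readonly Predicate<object> canExecute;

      public RelayCommand(Action<object> execute, Predicate<object> canExecute = null)
      {
        if (execute == null) throw new ArgumentNullException("execute");
        this.execute = execute;
        this.canExecute = canExecute;
      }

      public bool CanExecute(object parameter)
      { return canExecute == null || canExecute(parameter); }

      // piggyback on WPF's requery plumbing so buttons enable/disable themselves
      public event EventHandler CanExecuteChanged
      {
        add { CommandManager.RequerySuggested += value; }
        remove { CommandManager.RequerySuggested -= value; }
      }

      public void Execute(object parameter) { execute(parameter); }
    }

The ViewModel then just exposes e.g. public ICommand SaveCommand backed by a RelayCommand, and the XAML binds it: <Button Content="Save" Command="{Binding SaveCommand}" />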

SQL Server Table-Valued Stored Procedure Parameters <=> ADO.Net

Nutshell:
  1. Declare a User Defined Type (UDT)
  2. Declare a stored proc parm of that UDT
  3. Fill an ADO.Net DataTable with the same columns as the UDT
  4. Assign the DataTable to a Parameter of an ADO.Net SqlCommand corresponding to the sproc (a plain ADO.Net sketch of this step follows the code examples below)

Code Examples:
  1. File_UDT.sql
    CREATE TYPE File_UDT AS TABLE
    (
      FullPath varchar(900) PRIMARY KEY, 
      ModifiedDate datetime, 
      [Size] bigint
    )
    GO
    
    GRANT EXECUTE ON TYPE::dbo.File_UDT TO PUBLIC
    GO
  2. Files_UploadCompare.sql
    CREATE PROCEDURE [dbo].[Files_UploadCompare]
    @BackupProfileID INT,
    @NextDiscNumber INT = NULL OUT,
    @AllFiles File_UDT READONLY -- <= *****
    AS BEGIN
            
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
    
    -- new approach, simply return all files which don't match something already in the database 
    -- then we don't have to worry about partial results left in the tables ... 
    -- we just upload the current batch of files when we're done with each burn and then start fresh with the next batch selection from there
    -- there will be no records in FileArchive unless they've been put there specifically as marking a "finalized" MediaSubset
    
    SELECT *,
      CONVERT(BIT, 0) AS Selected,
      CONVERT(BIT, 0) AS SkipError
    FROM @AllFiles a
    WHERE NOT EXISTS(
      SELECT 1
      FROM FileArchive fa
      JOIN [File] f ON fa.FileID = f.FileID
      WHERE f.FullPath = a.FullPath AND fa.ModifiedDate = a.ModifiedDate AND fa.Size = a.Size
    )
    
    DECLARE @IncrementalID int
    SELECT @IncrementalID = MAX(IncrementalID) FROM [Incremental] WHERE BackupProfileID = @BackupProfileID
    
    SELECT @NextDiscNumber = isnull(COUNT(1),0)+1 FROM MediaSubset WHERE IncrementalID = @IncrementalID
    
    END
    
  3. FileSystemNode.cs
    static private void ScanFolder(FolderNode folder, DataTable IncludedFiles)
    {
      DirectoryInfo dir = new DirectoryInfo(folder.FullPath);
      FileInfo[] files = dir.GetFiles("*.*", folder.IsSubSelected ? SearchOption.TopDirectoryOnly : SearchOption.AllDirectories);
      foreach (FileInfo file in files)
      {
        DataRow r = IncludedFiles.NewRow();
        r["FullPath"] = file.FullName;
        r["ModifiedDate"] = file.LastWriteTimeUtc;
        r["Size"] = file.Length; //megabytes
        IncludedFiles.Rows.Add(r);
      }
    }  
    
  4. MainWindow.xaml.cs
    using (Proc Files_UploadCompare = new Proc("Files_UploadCompare"))
    {
      Files_UploadCompare["@BackupProfileID"] = (int)cbxBackupProfiles.SelectedValue;
      Files_UploadCompare["@AllFiles"] = IncludedFilesTable; // <= ******
      WorkingFilesTable = Files_UploadCompare.ExecuteDataTable();
      lblCurrentDisc.Content = Files_UploadCompare["@NextDiscNumber"].ToString();
    }
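For completeness, here's what step #4 looks like in plain ADO.Net without my Proc wrapper (a bare sketch; it skips the @NextDiscNumber OUTPUT parm) - the key bits are SqlDbType.Structured and TypeName:

    using System.Data;
    using System.Data.SqlClient;

    static class TvpDemo
    {
      public static DataTable UploadCompare(string connStr, int backupProfileID, DataTable allFiles)
      {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("dbo.Files_UploadCompare", conn))
        {
          cmd.CommandType = CommandType.StoredProcedure;
          cmd.Parameters.AddWithValue("@BackupProfileID", backupProfileID);

          SqlParameter tvp = cmd.Parameters.AddWithValue("@AllFiles", allFiles);
          tvp.SqlDbType = SqlDbType.Structured; // table-valued parameter
          tvp.TypeName = "dbo.File_UDT";        // must match the UDT declared above

          var result = new DataTable();
          conn.Open();
          result.Load(cmd.ExecuteReader());
          return result;
        }
      }
    }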
Tips:
  • (from here): If the login that SqlCommandBuilder.DeriveParameters is run under does not have permission to access the UDT, no error will be thrown - the method will return successfully, but the SqlCommand.Parameters collection will not contain the UDT parameter!!!
  • Granting permissions on a type (from here): GRANT EXECUTE ON TYPE::dbo.MyType TO public;

CAC (SmartCard) Enabling ASP.Net on IIS

  • The only configuration settings required are (IIS7 screenshots below):
    • Require SSL (this represents server side)
    • and either Accept or Require Client Certificates … “Accept” will populate the SmartCard’s cert info into your ASP.Net Request object (if it’s provided) but won’t deny access if one hasn’t been provided; “Require” will deny access unless a valid SmartCard cert has been provided.

Tips:

  • One key thing to be aware of is how this works: the server sends a list of Trusted Root Certificates down to the client/browser, the browser compares that list to the Trusted Roots represented by the CAC present, and only if there’s a match will it prompt for the Certificate and PIN input.  Therefore, both the server and the client must have the same Trusted Root Certs installed for this to work; the easiest way to do this for the DoD CACs is to grab the latest install_root.exe and fire that up.
  • Another key thing I discovered: after you get the certs installed, go ahead and do a reboot.  I was still getting 403 access denied errors that simply disappeared after I rebooted.
  • Throw these lines in an ASP.Net wizard-generated project’s Default.aspx to see the basic Cert info… the .Subject property is the juiciest looking info, there may be other properties of value.
    • <%=Page.Request.ClientCertificate.IsPresent%>
    • <%=Page.Request.ClientCertificate.Subject%>
  • It’s probably also helpful to go ahead and make sure your server-side SSL cert is properly named & not expired, such that you don’t get any warnings when you browse to the page… I was getting some errors related to that when I was working with Client Certs required.
    • this reference was helpful, see the section titled “Generate a Self Signed Certificate with the Correct Common Name”
    • this is the basic command you need to generate your own SSL cert for testing: SelfSSL /N:CN=www.whatever.com /V:9999
    • find SelfSSL in the IIS6 Reskit

(IIS7 screenshots)

Self Refresher

  • .Net Framework v4 New Features
    • Parallel Linq Extensions
  • C# 4.0 New Features (all good stuff IMPO, variance being the hardest to grok)
    • Named and Optional Parameters – already use this quite a bit
    • Dynamic Support – handy way to ignore the complexity of ‘dynamically’ generated type declarations (e.g. linq results & COM Interop)
    • Co/Contra-Variance – primarily intended to make .Net Framework methods with Generic type parameters like IEnumerable<T> “work like we’d expect” as is often quoted in explanatory texts (look for Jon Skeet and Eric Lippert).  It removes a couple rather unexpected annoyances that C# 3 would’ve snagged on (a tiny compile-time example follows this list).
      • Covariance represents “upcastable”.  Concerns types coming out of an API.  Represented by the “out” keyword, e.g. public interface IEnumerable<out T>
      • Contravariance is “downcastable”.  Typically concerns types passed into an API, e.g. public interface IComparer<in T>
      • Invariance is when something must go both in and out of a method and it can’t be declared differently on either side of the interface; it must be the same coming and going.
    • COM Interop
      • Dynamic Vars
      • Optional Parms
      • Optimized interop assembly file size
  • WPF4 New Features
    • New Controls – *DataGrid*, Calendar/DatePicker
    • Touch
    • Fancy Win7 Taskbar support
    • Easing Functions
  • Silverlight 4 New Features
    • New Controls – ViewBox (auto resize), RichTextBox
    • Out-Of-Browser Support
    • Printing support
    • Drag and drop, clipboard access
    • More WPF compatible XAML parser
    • DataBinding improvements – StringFormat, collection grouping, INotifyDataErrorInfo / IDataErrorInfo support
    • WCF Data Services enhancements – Direct POCO mapping via Linq queries on Open Data Protocol feeds.
    • Dynamic Support
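Since variance is the hardest one to grok, here's a tiny compile-time illustration with my own toy types (both assignments compile under C# 4; C# 3 would have refused them):

    using System.Collections.Generic;

    class Animal { }
    class Cat : Animal { }

    class HashComparer : IComparer<Animal>
    {
      public int Compare(Animal x, Animal y)
      { return x.GetHashCode().CompareTo(y.GetHashCode()); }
    }

    class VarianceDemo
    {
      static void Main()
      {
        // covariance: IEnumerable<out T> lets a sequence of Cats
        // flow *out* as a sequence of Animals
        IEnumerable<Cat> cats = new List<Cat> { new Cat() };
        IEnumerable<Animal> animals = cats;

        // contravariance: IComparer<in T> lets a comparer that accepts
        // any Animal stand *in* where a Cat comparer is expected
        IComparer<Cat> catComparer = new HashComparer();
      }
    }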

Configuring a Windows 7 PC as a WiFi Router

Update 2011-07-11: Primary WiFi client user ran into dismal buffering on video streaming… that’s the primary usage scenario, so PC as a Router is a NO-GO.  I loaded DD-WRT (following the wiki guide) and it’s working much better… should have done that in the first place, thanks bro! :)  (read something about a port forwarding bug in the standard build and went with the recommended VPN build)

I finally gave up on my piece of sh_t Linksys WRT310N as a viable router… I can’t believe those guys can sell such crap… even on the latest firmware (09/09/2010 v1.0.10 build 2) it would crash and crash… I tried mixed mode, G only & N only and whenever it would have to do any significant WiFi traffic at all, it would fail… just absolute junk… amazing there’s even a market for those bricks… plus the HTTP menus were pathetically slow when you’d click around.

To be fair, it is a “v1” hardware model and apparently there is a v2 out there going by the Linksys firmware downloads page. (My serial #: CSF01HB0919)

Since my mobo has a built in WiFi NIC, I decided to see how hard it would be to just use what I already have rather than dinking around with finding another router that would actually work.

As with anything, there are pros and cons… here’s a few off the top of my head:
  • PRO: you gain quite a bit of control leveraging less overall equipment (software firewalls are generally much more robust than a consumer router)
  • CON: you have to have your central PC powered up for any household WiFi action… in our case that seems inherently ok… wifey can hop on the central PC if I’m not using it… and if I am, then WiFi is available.

Bottom line, this works and covers all my bases so far:

Windows 7 as a Wireless Access Point
  • one time: netsh wlan set hostednetwork mode=allow ssid=XYZ key=PDQ keyUsage=persistent
  • after every reboot: netsh wlan start hostednetwork

ICS – Internet Connection Sharing

DynDNS update client – The DynDNS update feature is common to all routers… it’s nice that such a simple app alternative plugs this hole so I can keep on rocking my personal domain (I host all our photos directly from my home PC via zenPhoto).

Firewall settings – Since I’m plugged into a cable modem now, my PC is basically swinging directly out on the net, so a software firewall is much more important now than before when I’d be more safely behind the NAT barrier of the router.

I use the 100% free Comodo Internet Security… the UI is clean, e.g. one can resize its data grid based screens to view full detail (yes I’m talking about you BitDefender 2010!), I’ve never seen it jack CPU, and it provides a good mix between wizard style prompting and completely granular manual editing of the low level firewall rules.

Firewall configs are always “fun”… What worked for me just now was to select “Stealth Ports Wizard” and choose the “Alert me to incoming connections and make my ports stealth on a per-case basis” option.

*PLUS* the following individual rules under Firewall > Network Security Policy > …
(don’t forget to move them to the top so that they override any other block rules in the same bundle)

  • Application Rule on C:\Windows\System32\svchost.exe
    • For external HTTP/FTP hosting: Allow TCP Or UDP In/Out From MAC Any To BeejQuad Where Source Port Is Any And Destination Port Is In [HTTP/FTP Ports (21,80,443)]
    • For ICS client DNS “passthrough”: Allow And Log TCP Or UDP Out From In/Out [WiFi Home Access Point] To MAC Any Where Source Port Is Any And Destination Port Is In [DNS Ports (53)]
      • (interesting, normal pings would resolve fine with simple *in* enabled, but an SSL web site from the ICS client required *out* enabled as well, the firewall logs also showed a blocked packet coming from an external ip on port 53 to my central PC on a random port, but that didn’t seem to hurt… maybe my network buddy can explain this stuff)
  • Global rule
    • For ICS client Ping/ICMP support: Allow ICMP In/Out From In [WiFi Home Access Point] To MAC Any Where ICMP Message Is Any

I use Gibson Research’s “Shields Up!” (GRSU) online port scanner to check whether I’ve made any progress…

Interestingly, Comodo immediately prompted me for port 80 when GRSU scanned, but I had to use the above Stealth Ports selection to allow my port 21 rule to take effect.

    Grand Mal Mocha

    Ode to Griffin AirClick USB - Radio Frequency PC Media Remote

    This little bugger just so totally rocks!!!  IMHO the most compelling aspects are:

    • It’s cheap :). They tend to go for around $10-$25. There are still some out there on eBay from time to time (not now) and also Amazon at the moment.
    • It’s Radio Frequency technology – so you can zap iTunes to the next song from around the corner or out in the yard!!  Even my fancy iMON VFD remote is Infrared based (limited by line-of-sight) and that winds up being a deal breaker in my environment… couch faces projector wall away from the PC, IR = major fail! :(
    • It’s simple! – there are only the 5 most critical buttons to distract you… none of that typical Windows Media Center remote overload to worry about here… Play/Pause, Previous, Next & Volume Up/Down, that’s it.

    Unfortunately, the vendor, Griffin, has chosen to discontinue this little wonder.  If you’re interested in driving your PC based Media Players, make sure to get the USB version, not the iPod version which appears to still be in production. Take note, the transmitters that come with the readily available iPod version are 100% compatible with the USB receiver. This is a nice way for us to obtain replacement transmitters to have around.  Just check eBay… I just got a pair of clickers, including the iPod receiver and an interesting Velcro mount, for $4.50 including shipping!!!

    Griffin is nice enough to continue hosting their support page with drivers <whew>.  These native drivers work on any 32bit Windows since XP (more on 64bit support below).

    And Dmitry Fedorov has been keeping the dream alive by showing us how to build our own little application-specific .Net plugins for the basic Griffin driver API.

    Ok so that’s all fine and dandy, now let’s get to the good stuff!!
    I currently like iTunes and VLC and Windows 7 64bit and I’ve found a couple freewares that, well, make beautiful music together (couldn’t resist :)

    iTunesControl – In his infinite wisdom, Mr. Jobs hasn’t seen fit to support *global* Windows Media Keys in iTunes … fortunately for us, Carson created iTunesControl. Within the HotKeys section, one must simply click the desired key event (e.g. “Next Track”) and then press the corresponding AirClick button to make the assignment (Don’t forget to hit the Apply button).  It also provides a very nifty, super configurable Heads Up Display that I absolutely love. To be more specific, I only mapped Play/Pause, Next & Previous this way.  I left volume up/down defaulted to Windows global volume which provides convenient system wide volume control no matter what’s active (see last paragraph).

    Now, the dark clouds started rolling in when I upgraded to Win 7 64bit and realized that the basic Griffin software install does not happen under 64bit, zip, nada, no-go <waaahh>… then I found this next little gem, affectionately called…

    AirClick Interface Script - The way Jared explains it, fortunately for us, at least the HID layer of the Griffin driver is operational under 64bit. So he wrote an AutoHotKey script which picks up on the HID messages coming from the AirClick and turns those into Windows Media Keys.  The WinMedia Keys are then caught by iTunesControl and iTunes immediately does our bidding, brilliant! Jared provides his original script source as well as a convenient compiled exe version that you just run and go.

    NOTE: Jared’s script maps a 4-second press of the volume-down to PC go night-night. To me this isn’t so handy and I’d much rather have a repetitive volume adjust when held down. So I tweaked his script a little, find that here (ready-to-run EXE version). If you wish to run the raw script or perhaps compile some of your own tweaks, then you must use the original AutoHotKey. The newer “AutoHotKey_L” branch would not work for me.

    The last thing I’ll mention is subtle but neato… Jared’s script actually checks to see which window is active.  If none of the well-known players is focused (VLC, Winamp, MediaPlayerClassic, PowerDVD), then it defaults to firing Windows Media Key events.  The nice thing is, if say VLC is active, then Jared’s script fires VLC specific play/pause, rewind & fast forward keys… so if I’m bouncing around the house, iTunes is getting the WinMedia events… if I’m sitting down watching a movie, I just have to make sure VLC is the active window and iTunes is left alone, perfectly intuitive!

    UPDATE 10 March 2012

    It’s a nice pastime to watch a photo slideshow while listening to tunez. Previously I’d been using the Google Photo Screensaver. But we soon wanted the ability to back up and stare at one of the slideshow photos, via the remote. I found Photo Screensaver Plus by Kamil Svoboda to fit very well. Among other very robust functionality, it supports left cursor to back up in the photo stream and space to pause the slideshow. With that, I edited my new AutoHotKey script (exe) to provide the following:

    • when slideshow screensaver is not running, hold down play/pause remote button to start up screensaver slideshow
    • when slideshow is running, reverse button goes to the previous image and pauses the slideshow
    • when slideshow is paused, play/pause restarts the slideshow
    • otherwise all buttons pass through to media events as usual

    I really like how you can dual purpose the buttons depending on the context… that’s powerful.

    Kamil’s screensaver also provides a hotkey to copy the current image to a favorites folder, very cool.  And a hotkey to edit the image’s EXIF metadata – Name, Description & Comment.  The nifty thing there is we also publish our photos via a Zenphoto web image gallery. Once we edit the EXIF info in the screensaver, a little PowerShell script of mine refreshes ZenPhoto’s MySQL database entry for that image so the new image name and comments are immediately available for viewing and SEARCHING via the web gallery’s search facility, nice!  The PowerShell script uses Microsoft’s PowerShellPack to provide effortless FileSystemWatcher integration. We really do have everything we need to mix and match unintentional building blocks into our own satisfying hobby solutions these days with amazingly little effort. I mean, who could’ve anticipated using a screensaver of all things as a data entry front end?

     

    Hot Corners - This free tool does the job and AutoIT source code is provided.

    Evolving a custom ADO.Net based repository

    All code.

    Update 3/24/2015 - Google is shutting down code.google.com :( but provides easy migration to github: new source code link.

    Main WPF4 LOB project post.

    Concept: A framework for maintaining column specific repository consistency with database update “side effects”.

    Prerequisites:
    • Stored procedures = business rules - My data layer is basically a home grown spin on ADO.Net Typed DataSets.  i.e. “Business” Class wrappers around ADO.Net classes (DataSets, DataRelations, DataTables, DataViews, DataRows, DataRowViews, etc).  I like to keep the majority of my business rules in stored procedures (“procs”).  I've experienced sustained, maintainable progress on LOB projects facilitated by an evolving relational model.  It's often beneficial to meet growing awareness of business entity relationship requirements entirely in the proc queries with no changes necessary in higher layers.  Being able to change how a particular list is populated WITHOUT REQUIRING A BINARY RELEASE can be very powerful.  I realize this may all seem controversial to an OO mindset but it’s served me well over multiple *database* oriented projects. If your project is not inherently table oriented, please stop right here. This is very much a relationally oriented design approach. If one is fortunate enough to have the freedom to design the database as part of the overall solution scope, and therefore stored procedures are fair game, then to not take advantage of procs as “business methods” is throwing away a huge asset. If one is not that lucky, and I realize big corporate projects tend not to be, then I completely understand taking great OO measures to insulate one’s beautiful architecture away from the messy legacy database structure. EntityFramework welcomes you :)  Otherwise, I feel that remaining near and dear to one’s mother database is a very fruitful relationship.  Procs are easily maintainable and deployable - no binaries, very scriptable.
      • Naturally, accepting dependence on a database for business rules does imply that our application must be generally connected to a database. One could argue this doesn’t fly for disconnected client scenarios, i.e. mobile devices. However, it’s not far-fetched to have a local database which provides this support and then updates to the big mother database (cloud, etc) when connectivity is restored. One could still leverage the readily deployable nature of stored procs to provide the right business smarts to the local DB. Indeed, a tiered relational-centric model vs typical tiered OO-centric architectures which relegate relational technology to the last tier only :)
    • MS SQL Server 2005+ - This post includes the usage of the SS 2005+ "OUTPUT” syntax. As with anything, the underlying concepts I’m proposing could be reapplied to other platforms and frameworks (I’m not familiar but wouldn’t be surprised if Oracle allows something similar). T-SQL OUTPUT is just a nice syntactical convenience with mild potential performance benefit.
    Business Example
    To frame a case which demonstrates the need for side effects – take a look at the adjacent screenshot. The scenario is we’ve got a household with some people in it (aka members, aka clients). In this business domain only one person can be the sponsor of a household at any given time. Likewise there can be only one spouse set, the spouse which is not the sponsor. These designations are maintained as flags on the “Clients” database table. In this example, we’re exploring what needs to happen when the sponsor changes from one person to another. This can happen when the existing sponsor leaves the business system which grants this privilege, yet the spouse remains in the system and can therefore assume the sponsorship privilege and nothing else needs to change.
    So, in the UI the current sponsor is Sgt. John Snuffy. To effect this desired change, the user would select the “Set Sponsor” button on the spouse entry (Mrs. Jane Snuffy). As is typical in tiered design, this button fires a Business Object method SetSponsor(…).

    By design, my Business Class methods tend to be fairly light wrappers around proc calls. For example (full source):
    public void SetSponsor(string NewSponsorClientGUID, bool FixExistingPackageLinks)
    {
      using (iTRAACProc Sponsor_SetSponsor = new iTRAACProc("Sponsor_SetSponsor"))
      {
        Sponsor_SetSponsor["@SponsorGUID"] = GUID;
        Sponsor_SetSponsor["@NewSponsorClientGUID"] = NewSponsorClientGUID;
        Sponsor_SetSponsor["@FixExistingPackageLinks"] = FixExistingPackageLinks;
        TableCache(Sponsor_SetSponsor);
        HouseMembers = HouseMembers; //for some reason OnPropertyChanged("HouseMembers") didn't refresh the Members Grid, i don't have a good guess but this little hack worked immediately so i'm moving on
      }
    }
    
    Line #8 above is the huckleberry.  It resides in the BusinessBase class… basically it fires the proc and then goes into the DataSet.Merge() logic explained below.
    While we’re looking at this code, let me quickly divert to explain the “Proc” class. Nutshell: Proc is a convenient wrapper around ADO.Net SqlCommand. Among other things it does the SqlCommandBuilder.DeriveParameters() + caching thing that you’ll find in many similar wrappers like this (e.g. Microsoft’s Data Access Application Block - I just didn’t fall in love with their API and wanted my own spin). DeriveParameters() removes the dreary burden of all that boring proc parm definition boilerplate code prior to each proc call (add param by name, set the datatype, etc.) and just pulls all that out of the database metadata that already knows all that information anyway - brilliant. Therefore we get right to the point of assigning values to named proc parms and firing the query. SqlClientHelpers.cs contains the Proc class as well as all kinds of data helper methods that have evolved over several projects. I wouldn’t want to start a database project without it at this point.
    iTRAAC is the name of the project I pulled this example from. iTRAACProc is a very light subclass that assigns a few common domain specific parms (e.g. UserID) before handing off to the base Proc class. Conveniently, the Proc class' parm[“@name”] indexer ignores anything that's not declared on the specified proc, so only procs that actually require these parms will receive them.
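    If it helps make that concrete, here's a stripped-down sketch of the idea (my toy paraphrase, not the actual SqlClientHelpers code; not thread-safe, toy only):

      using System;
      using System.Collections.Generic;
      using System.Data;
      using System.Data.SqlClient;

      // toy version of the Proc idea: DeriveParameters once per proc name,
      // cache the parameter shapes, then assign values by name and fire away
      class TinyProc : IDisposable
      {
        static Dictionary<string, SqlParameter[]> cache = new Dictionary<string, SqlParameter[]>();
        SqlCommand cmd;

        public TinyProc(string procName, SqlConnection conn)
        {
          cmd = new SqlCommand(procName, conn) { CommandType = CommandType.StoredProcedure };
          SqlParameter[] shape;
          if (!cache.TryGetValue(procName, out shape))
          {
            if (conn.State != ConnectionState.Open) conn.Open();
            SqlCommandBuilder.DeriveParameters(cmd); // the metadata round-trip, once per proc name
            shape = new SqlParameter[cmd.Parameters.Count];
            for (int i = 0; i < shape.Length; i++)
              shape[i] = (SqlParameter)((ICloneable)cmd.Parameters[i]).Clone();
            cache[procName] = shape;
          }
          else
            foreach (SqlParameter p in shape)
              cmd.Parameters.Add((SqlParameter)((ICloneable)p).Clone());
        }

        // assign parm values by name; silently ignore names this proc doesn't declare
        public object this[string parm]
        {
          get { return cmd.Parameters.Contains(parm) ? cmd.Parameters[parm].Value : null; }
          set { if (cmd.Parameters.Contains(parm)) cmd.Parameters[parm].Value = value; }
        }

        public DataTable ExecuteDataTable()
        {
          if (cmd.Connection.State != ConnectionState.Open) cmd.Connection.Open();
          var result = new DataTable();
          result.Load(cmd.ExecuteReader());
          return result;
        }

        public void Dispose() { cmd.Dispose(); }
      }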
    Ok so back to our scenario… Besides setting the flag on Jane’s record to indicate she is now the sponsor, we also need to remove the sponsorship flag from John as well as flip the spouse flag from Jane to John (other queries and reports depend on having those flags consistent)… and oh, by the way, we also want to log all of this to the audit table so there’s a historical reference of what changes brought us to the current state of a household.  We want to drive all of this from the database proc logic and once the database has changed we want the UI to magically update to reflect all these changes and additions (including the new audit record aka “Diary” in the UI). So this is where we’ve arrived at what I call side effects (maybe there’s a better term?). That is - corresponding to a relatively innocent looking user action, our desired business rules will drive various values to be changed and entirely new rows to be added that are not directly maintained by the user. This is not simple CRUD table maintenance, this is real business rules with all the crazy interconnections that must be supported :)

    Update-proc example (full source):
    SET @TableNames = 'Client'
    UPDATE iTRAAC.dbo.tblClients
    SET StatusFlags = CASE WHEN RowGUID = @NewSponsorClientGUID THEN StatusFlags | POWER(2,0)
                      ELSE StatusFlags & ~POWER(2,0) END
    OUTPUT INSERTED.RowGUID, CONVERT(BIT, INSERTED.StatusFlags & POWER(2,0)) AS IsSponsor
    WHERE SponsorGUID = @SponsorGUID
    AND RowGUID IN (@OldSponsorClientGUID, @NewSponsorClientGUID)
    
    Line #1 is pertinent.  By convention, all procs which need to participate in the approach I’m proposing in this post, must have a @TableNames OUTPUT parameter.  This is a CSV list of table names corresponding to each resultset returned from the proc (in sequential order).  This way, the proc *generically* informs the datalayer what must be merged into the client data cache (repository).

    Line #5 above is cool - rather than reSELECTing the modified data... OUTPUT lets us leverage that UPDATE already knows what rows it hit. I dig it.  Back on the client side, the datalayer takes that PARTIAL (i.e. very column specific) result-set and merges it back into the cache like so (full source):
    //nugget: DataSet.Merge(DataTable) has become a real linchpin in the whole data roundtrip approach
    //nugget: in a nutshell, update procs return a bare minimum of updated fields in a return resultset along with a corresponding CSV list of @TableNames
    DataTable cachedTable = dsCache.Tables[tableName];
    dsCache.Merge(incomingTable, false, (cachedTable == null) ? MissingSchemaAction.AddWithKey : MissingSchemaAction.Ignore); //PreserveChanges pretty much has to be false in order to count on what comes back getting slammed in
    
    The Big Picture
    What this approach tees up is that your procs can drive an unlimited amount of side effects which can be granularly returned to the client side cache.

    Since you can pick and choose exactly which columns are returned (via standard selects or OUTPUT clause) you can weave a fine tuned blend between exactly which fields are allowed to come back in the side effects and blast into the client cache vs what fields may have pending uncommitted user edits in the cache. That’s pretty cool.

    View->ViewModel (MVVM) environments with robust declarative databinding, like WPF, really shine when you see all of these side effects immediately manifest on the UI just by bringing the data back into the BusinessObject(DataSet) cache (that the UI is bound to).  The procs are very much in control of the business logic and ultimately what’s displayed, yet without being coupled to the UI. Great stuff.

    Additional perks in the code provided:
    • An interesting “union-like” benefit in the datalayer – I ran into requirements where the most appealing clean design was to modularize subroutine procs that would be called from other procs. Fair enough so far. On top of that I found need to return these field level data changes (aka side effects) for the same entity table, from multiple procs in the subroutine chain. e.g. Client –> Proc1 –> SubProc2 & SubProc3. The impact of burdening the T-SQL proc layer with capturing the multiple proc results and union’ing them together is ugly design. It wound up being very clean and convenient to defer the union of these multiple selects to the TableCache C# datalayer logic. The “union” effect is readily implemented by looping through the tables of the same name and using ADO.Net’s “DataTable.Merge()” to apply each incoming rowset to the existing entities in the repository cache. Including matching primary keys in the incoming rowsets facilitates updates to cached entities vs inserts. (A rough sketch of this merge loop follows this list.)
    • Handy initial client side rows – I should say, this next bit is actually a technique that’s struck me as convenient yet it’s not specifically dependent on the TableCache approach … these building blocks do all however play into each other to nicely address what I’ll call the “new row dilemma” … that is, one typically needs some blank rows to databind to when you’re creating a new record in the UI… but it’s often undesirable to physically manifest these rows in the database until you’re sure they’re really going to be committed… it really stinks to sacrifice data integrity constraints just to allow for initial empty rows… a typical solution is to DataTable.Rows.AddRow() on the client and leave the DB out of it until you commit fully validated rows… but now client code is responsible for initializing new rows. I hate that for a couple reasons. First, I want that logic in the procs, where I can evolve it at will at the database tier w/o needing to deploy a new client binary. Secondly, for client logic consistency, it’s much cleaner for new row logic to work exactly the same way as existing row logic. So the execution goes something like this:
      1. New row event on client generates a brand new GUID PK (Some form of very unique ID seem fairly necessary to allow the client to do this autonomously from the DB).
      2. But otherwise the client logic just flows into the standard “GetEntityByID” proc call, passing the new GUID none the wiser whether it’s new or existing… i.e. zero logic flow difference between new record vs existing record, nirvana :).
      3. Of course this fresh GUID won’t get a row hit which conditionally falls into the new row logic where I return a “fake” row populated with whatever defaults I desire… take note, I’m not actually inserting a row into the table and then selecting that back out, I’m doing a select with “hard coded” values and no “from table” clause… that way I don’t insert junk data nor forsake constraints, but the new row logic is kept in the proc layer – beautiful.
      4. Lastly, when committing to the DB, you fire the typical upsert proc which checks if it's doing an insert or update by seeing if the PK exists and acting accordingly.
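    To make that union-like merge concrete, here's a rough paraphrase of the TableCache loop (from memory, not the actual source):

      using System.Data;
      using System.Data.SqlClient;

      static class RepositoryHelper
      {
        // one resultset per name in @TableNames (CSV), each merged into the
        // client-side DataSet cache; a repeated table name simply merges again,
        // which is exactly the "union-like" effect across subroutine procs
        public static void TableCache(DataSet dsCache, SqlCommand cmd, string tableNamesCSV)
        {
          string[] names = tableNamesCSV.Split(',');
          if (cmd.Connection.State != ConnectionState.Open) cmd.Connection.Open();
          using (SqlDataReader reader = cmd.ExecuteReader())
          {
            foreach (string name in names)
            {
              if (reader.IsClosed) break; // fewer resultsets than names - bail
              var incoming = new DataTable(name.Trim());
              incoming.Load(reader); // consumes one resultset, advances to the next
              DataTable cached = dsCache.Tables[incoming.TableName];
              dsCache.Merge(incoming, false,
                cached == null ? MissingSchemaAction.AddWithKey : MissingSchemaAction.Ignore);
            }
          }
        }
      }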
    Dear reader what do you think? Am I nuts? Am I bucking the OO trend too much? Will the OO gods conspire to make me pay dearly for my database blasphemy :)

    Going Portrait!

    Do you realize how freakin’ cool it is to be able to hit ALT-F1 on a table in SSMS and be able to immediately view all of the output without futzing around in all the little result-set scroll bars!?! There are 6 tables that come back from sp_help and now they have the room they deserve… the visual query plan tool is more horizontal so I have a feeling that’s going to take a little hit… we’ll see. Great for dual pane WPF dev in VS2010 too… with a typical visual pane up top, there’s now tons more room for raw XAML in the bottom… and XAML gets verbose in a hurry so this was becoming a critical annoyance for me.

    And not only dev oriented activities, Outlook feels better too… and it’s amazing how many web pages seem like they were made for portrait… so nice to see a whole page at once w/o a scroll bar.

    Bottom line: THE most dramatic yet drop dead easy computer improvement I’ve done in a long time.


    Transcoding Motion-JPEG (.MOV) to MPEG-4 (H264)



    Newer Approach

    I have a Panasonic DMC-ZS5 (dpreview, Panasonic) which creates .MOV files that contain the Motion JPEG (M-JPEG) “format” (for want of a more technical term).

    In order to stream those videos from my IIS/PHP based photo gallery (zenPhoto.org), they must be converted to a more “web compatible” format like MPEG-4.  I haven’t found a more straightforward approach than direct batch conversion to another format… you can readily automate the conversion of say all the videos in a folder so it’s pretty much turnkey and ignore.

    Update 2015-01-05: This is my current go-to:
    for %v in (*.mov) do ffmpeg -i "%v" -vcodec h264 -acodec aac -strict -2 "%~nv.mp4"
    
    Notes:
    • make sure to double up all the %v's (i.e. %%v) if you put it in a .cmd batch file
    • ffmpeg is a very popular 3rd party command line util. I get mine from here.

    Update 2015-07-18: cropping 3D movies down to a single image
    ffmpeg -i "in_movie_file.ext" -vf "crop=960:800:0:0,setdar=4:2" -vcodec h264 -acodec aac -strict -2 "out_movie_file.mp4"
    • obviously check the real resolution before setting the crop... just divide it by 2
    • "setdar" is the aspect ratio... i found it was necessary... one way to find it is with VLC CTRL-J on the original video

    VLC will do this via a command line like so:
    "c:\Program Files (x86)\VLC\vlc.exe" -vvv %1 --sout=#transcode{acodec=mpga,vcodec=h264,venc=x264,deinterlace,vfilter="rotate{angle=270}"}:standard{mux=mp4,dst="%~n1.mp4"}, vlc://quit
    

    Notes:
    • I’ve had to remove the acodec=mpga for my iPhone MOV’s or else I get garbled audio.
    • I included the vfilter=”rotate…” for rotation syntax since it was so hard for me to find, but only include it if you want rotation.

    However, I noticed that VLC chops off the last 2 seconds no matter what I do… it seemed a little better choosing a different vcodec but h264 is too rocking to use anything else.

    So I wound up going with QuickTime as my go-to transcoder for now.  It doesn’t truncate any video and creates a slightly smaller output file than VLC.  The compression is dramatic and h264 does an awesome job with preserving quality… even while maintaining 1280 x 720 HD, a 100MB MJPG will go down to a 5MB h264/MPEG file.

    The following code, stolen from here and tweaked a little, automates the QuickTime COM API to convert a directory full of MJPGs (see sample code for Chapter 8 > “BatchExport.js”).

    There’s no reason why this shouldn’t be in PowerShell… it’d be interesting to see if it was any more readable.
    //----------------------------------------------------------------------------------
    //
    //    Written by    :    John Cromie
    //    Copyright    :    © 2006 Skylark Associates Ltd.
    //                                                                               
    //    Purchasers of the book "QuickTime for .NET and COM Developers" are entitled   
    //    to use this source code for commercial and non-commercial purposes.                   
    //    This file may not be redistributed without the written consent of the author.
    //    This file is provided "as is" with no expressed or implied warranty.
    //
    //----------------------------------------------------------------------------------
     
     
    function unquote(str) { return str.charAt(0) == '"' && str.charAt(str.length - 1) == '"' ? str.substring(1, str.length - 1) : str; }
     
     
    // Run from command line as follows:
    //
    // cscript BatchExport.js <sourcePath>, <destPath>, <configXMLFilePath>, <convertFileExtension>, <exporterType>, <exportFileExtension>
     
    var sourcePath, destPath, configXMLFilePath, convertFileExtension, exporterType, exportFileExtension;
     
    // Get script arguments
    if (WScript.Arguments.Length >= 6)
    {
        sourcePath = unquote(WScript.Arguments(0));
        destPath = unquote(WScript.Arguments(1));
        configXMLFilePath = unquote(WScript.Arguments(2));
        convertFileExtension = unquote(WScript.Arguments(3));
        exporterType = WScript.Arguments(4);
        exportFileExtension = WScript.Arguments(5);
    }
     
    //sourcePath = "D:\\QuickTime\\Movies\\Birds\\Kittiwake";
    //destPath = "D:\\QuickTime\\Movies\\Export\\Dest";
    //exporterType = "BMP";
    //exportFileExtension = "bmp";
     
    // Sanity check arguments
    var fso = WScript.CreateObject("Scripting.FileSystemObject");
     
    var e = "";
     
    if (!fso.FolderExists(sourcePath))
        e += "Source path does not exist : " + "[" + sourcePath + "]\n";
        
    if (!fso.FolderExists(destPath))
        e += "Destination path does not exist : " + "[" + destPath + "]\n";
     
    if (!fso.FolderExists(configXMLFilePath))
        e += "Config XML file path does not exist : " + "[" + configXMLFilePath + "]\n";
     
    if (convertFileExtension == undefined)
        e += "No convert file extension supplied!\n";
     
    if (exporterType == undefined)
        e += "No exporter type supplied!\n";
        
    if (exportFileExtension == undefined)
        e += "No exporter file extension supplied!\n";
     
    if (e != "")
    {
        WScript.Echo(e);
        WScript.Echo("Usage:");
        WScript.Echo("cscript BatchExport.js , , , , , ");
        WScript.Quit();
    }
     
    // Launch QuickTime Player   
    var qtPlayerApp = WScript.CreateObject("QuickTimePlayerLib.QuickTimePlayerApp");
     
    if (qtPlayerApp == null)
    {
        WScript.Echo("Unable to launch QuickTime Player!");
        WScript.Quit();
    }
     
    var qtPlayerSrc = qtPlayerApp.Players(1);
     
    if (qtPlayerSrc == null)
    {
        WScript.Echo("Unable to retrieve QuickTime Player instance!");
        WScript.Quit();
    }
     
    // Set up the exporter and have it configured
    var qt = qtPlayerSrc.QTControl.QuickTime;
    qt.Exporters.Add();
    var exp = qt.Exporters(1);
    exp.TypeName = exporterType;
     
    // settings file...
    var FileSystemObject =  WScript.CreateObject("Scripting.FileSystemObject");
    var configXMLFileInfo;
     
    if ( FileSystemObject.FileExists(configXMLFilePath) )
        configXMLFileInfo =  FileSystemObject.OpenTextFile( configXMLFilePath );
     
    // if settings files exists, load it and assign it to the exporter
    if ( configXMLFileInfo )    {
        var configXMLString = configXMLFileInfo.ReadAll();
        // cause the exporter to be reconfigured
        // http://developer.apple.com/technotes/tn2006/tn2120.html
        var tempSettings = exp.Settings;
        tempSettings.XML = configXMLString;
        exp.Settings = tempSettings;
    } else  {
        //otherwise, get the settings from the user dialog and save them to xml file for subsequent runs
        exp.ShowSettingsDialog();
     
        var configXMLString = exp.Settings.XML;
        configXMLFileInfo = FileSystemObject.CreateTextFile( configXMLFilePath );
        if ( configXMLFileInfo )  {
            configXMLFileInfo.WriteLine(configXMLString);
            configXMLFileInfo.Close();
        } else {
            WScript.Echo("Unable to create config XML file : " + "[" + configXMLFilePath + "]");
            WScript.Quit();
        }
     
    }
     
     
    var fldr = fso.GetFolder(sourcePath);
     
    // Regular expression to match file extension
    var re = new RegExp("\\." + convertFileExtension + "$", "i"); // double the backslash so the dot is matched literally
     
    // Iterate over the source files
    var fc = new Enumerator(fldr.Files);
    for (; !fc.atEnd(); fc.moveNext())
    {
        var f = fc.item().Name;
        
        // Filter by file extension
        if (!re.test(f))
            continue;
        
        try
        {
            // Open the movie and export it
            qtPlayerSrc.OpenURL(fc.item());
            
            var mov = qtPlayerSrc.QTControl.Movie;
            if (mov)
            {
                exp.SetDataSource(mov);
                
                // Strip file extension and compose new file name
                f = f.replace(/\.[^\.]*$/, "");
                var fDest = destPath + "\\" + f + "." + exportFileExtension;
                
                exp.DestinationFileName = fDest;
                exp.BeginExport();
                
                WScript.Echo("Exported: " + fDest);
            }
        }
        catch (err)
        {
            WScript.Echo("Error Exporting: " + fc.item());    
        }
            
    }
     
    // Tidy up
    qtPlayerSrc.Close();
    

    Garmin 10x Bluetooth GPS Receiver Broken Clip Hack

    I really really dig how this popped together… the reason: the stock Garmin plastic clip busted on me last Saturday.

    (photos: DSCF5074 - notes)

    Understanding Exposure

    This is a great book by a Mr. Bryan Peterson

    Make sure you get the latest edition (currently Aug.2010).

    Bryan has an easy-going writing style packed with tons of real examples.

    It’s not a very long book (~175 pages) and there are lots of great example photos filling up nearly every page.

    It is highly rated on Amazon… only $20 with shipping.

    Basic takeaways for my own future reference:
    • Bryan uses the term “Exposure Triangle” to relate the three interrelated fundamentals of capturing ideal exposure: F-stop (aperture size), Shutter Speed and ISO.
    • Photography of course has a lot to do with *light* … how much light we have to work with and how much we want to let into the camera.
    • ISO
      • Undoubtedly the toughest one to grasp at the pure physics level. I appreciate how Bryan uses the simple metaphor of "worker bees" here.
      • The higher the ISO, the more worker bees you have on your electronic photo sensor gathering the light particles. There is of course a trade-off and I think we've all observed the grainy result of too much ISO.
      • It’s great to read the specific values & tips Bryan recommends throughout the book… e.g. setting your ISO higher than 200 starts to lose contrast and color saturation.
    • f/stop (aperture size)
      • aperture is literally the size of the opening allowing light to enter the camera… Yet just controlling the amount of light is not really the most useful aspect of f-stop…
      • More importantly f-stop is what determines the “depth of field” in a photograph.
      • F/4.5 is a very “average” middle point to begin with.
      • The *higher* the F/# the *smaller* the aperture (light opening) (because the number is on the bottom of a division)
      • Smaller apertures wind up pulling in a greater “depth of field” which just means more of the background is sharper.
      • Wider apertures (*smaller* f/#’s, i.e *divided* by a smaller number = bigger) give that fuzzy background effect (bokeh)… typically when you want to draw the most attention to a specific subject vs a complex background.
    • Shutter Speed
      • this is the most intuitively obvious one in my mind… it primarily determines whether you capture motion or not… a quick shutter “stops action”… a slow shutter gives that more blurry look to moving objects like water.
    We then balance all three of those in our “Exposure Triangle”…

    For one example, starting with the desire to have full focus on a long view (e.g. big field of flowers), we select a high f/stop. If it's a bright easy light day, we can leave our ISO low … lastly we move the shutter speed up or down until our camera’s light meter falls on ZERO.

    I had never been clued in on that fundamental part about adjusting one or the other (aperture or shutter) in order to *move*the*light*meter* bar back to center zero.  One typically does this looking through the viewfinder at the little gauge of vertical bars with 0 in the middle.


    This was a pretty big revelation for me.  Maybe I'm particularly clueless :) and it's considered so obvious that it's not worth mentioning; but I also wonder how many people carry around multiple hundred dollar cameras without knowing this.

    For an alternative example, if we want to capture that “blurry water” effect on a stream or a waterfall, we’ll start with a longer shutter (e.g. 1/8 sec or even 1 full sec) and then move the f/stop to get the light meter to 0… the f/stop will be high in this case (perhaps even f/32) because a long shutter is a long exposure to light and therefore a correspondingly small opening is necessary to counteract that light washout (i.e. overexposure).
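    To put numbers on that balancing act, here's a quick stop-math sanity check using the standard exposure value formula (my addition, not from the book):

      % EV relates aperture (f-number N) and shutter time (t, in seconds);
      % equal EV = equal exposure at a fixed ISO
      \mathrm{EV} = \log_2\frac{N^2}{t}

      % three equivalent exposures, one stop apart on each dial:
      % f/4   @ 1/250s:  \log_2(4^2   \cdot 250) \approx 11.97
      % f/5.6 @ 1/125s:  \log_2(5.6^2 \cdot 125) \approx 11.94
      % f/8   @ 1/60s:   \log_2(8^2   \cdot 60)  \approx 11.91

    Each row doubles the shutter time while closing the aperture one stop, so the total light stays the same; ISO is the third lever (each doubling of ISO buys back one stop).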
    (cover image: Understanding Exposure, 3rd Edition - Bryan Peterson, 2010.08)
    ISBN-10: 0817439390
    ISBN-13: 978-0817439392

    YASBE – Open Source Code Incremental Backup Windows WPF Application

    The reason I started this little project is none of the premier backup packages currently support Blu-ray… I know that sounds amazing but check out the help forums for stuff like Acronis and Paragon and Yosemite… it’s not a pretty picture out there currently with regards to Blu-ray… and of course, I had already bought my BD drive before I started to realize how dismal this all was… so I was inclined to find a solution :)

    I’ll admit right up front, the UI is a bit cluttered and terse… classic good-enough-for-own-purposes-in-the-time-i-had syndrome

    (screenshot)
    • full source svn repo. Update 3/24/15 - Google is shutting down code.google.com :( but provided easy migration to github: new source link.
    • Basically, it just works like a champ… I really like how it came together… WPF is awesome… it all feels *very* peppy & responsive on my “aging” Quad Core 2…
    • currently implemented on sql server 2008 (express)… should be relatively database agnostic in theory, but…
    • The one big sql server 2008 dependency that I do use is SQL Server table-valued stored proc parameters.
    • install the default database structure via .BAK file
    • This SQL Server table proc parm approach is a particularly fun optimization I’ve been itching to implement to see how it hangs together, ahead of using it elsewhere (whenever I can finally get my work to upgrade to SQL Server 2008!!! :)
    • Anyway, as far as the actual application goes, see screenshot, it’s WPF 4 code with a lot of little tricks I’ve learned along the way with my other much larger scope WPF LOB project at work.
    • YASBE (“Yet Another Simple Backup Enabler”) immediately presents the typical checkboxed include/exclude filesystem tree where you select which folders are in and out… you can of course simply select a root drive letter if you’re organized to have everything you care about on one big data drive.
    • I underestimated the complexity of rolling my own folders treeview but I like the work I achieved in the .Net IO FileSystem code & the corresponding WPF TreeView XAML here (search for “TreeView”)… I’ve seen other examples of loading a WPF TreeView (telerik knowledge base etc)… but I feel like I did mine a little tighter… easier to copy/understand I think… the tree is efficiently lazy loading… i.e. it only scans the next set of folders down when you expand a parent (a generic sketch of the lazy-load trick follows this list)
    • Then one would typically hit the “Select Next Disc's Worth Of Files” button and YASBE cranks down the list until it’s included 25GB worth of new/changed files that are candidates for going to a Blu-ray disk.
    • the .Net DirectoryInfo.GetFiles() appears to be adequately performant on my average desktop hardware … it scans my 200GB+ of photos and other important documents in <16 seconds… actually it scans all those files, –AND- uploads it to sql server (via table stored proc parameter) and does the comparison to all the previously recorded date stamps to determine what is new/changed… –AND- sends that recordset back to the client and displays it on a datagrid, all that in 16 seconds… I’m absolutely pleased with that… I feel that the master blast of all those file records up to SQL Server using the table valued stored proc parm really nicely optimizes that step.
    • Then one would hit “copy to staging folder”… wait quite a bit longer to copy 25GB to your Blu-ray’s staging folder (actually it’s effectively closer to 23GB max from what I’ve read)
    • Then I highly recommend you burn your Blu-ray by drag/dropping your burn staging folder into Nero BurnLite (which is free)
    • Nero BurnLite has been 100% reliable for me and it’s a perfectly bare bones data disc burning software, exactly what I want, without any other fluff.
    • I had *major* reliability problems with Windows 7's built-in disc burning facility!!!… I coastered 5 out of 6 tries before I bailed and went to Nero… it becomes mentally painful trial and error at 25GB a pop for a cheap arse like me :)… Yet Win7 seems absolutely fine for DVD/CD burning… I’ve burned those successfully w/o a glitch.
    • Here’s the interesting anecdotal evidence, after the burn, Nero spews out a list of mandatory renames for files that somehow wouldn’t fit the disc’s file system… which is UDF I believe… I’m wondering if Win7 doesn’t perform that kind of necessary bulletproofing and that’s why the burns would always fail several minutes in, after wildly jumping around between random %complete estimates and a schizophrenic progress bar.
    • Nero methodically clicks off percent-completes nice & fast … seems like 25GB only takes about 15mins… very doable… I did 8 x 25GB discs to cover my whole photo library while working on other things and it went by like clockwork.
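    For what it's worth, here's the usual lazy-folder-tree trick sketched from memory (a dummy child placeholder swapped out on expand) - my YASBE version is XAML-bound, but this code-only toy shows the gist:

      using System.IO;
      using System.Windows;
      using System.Windows.Controls;

      static class LazyFolderTree
      {
        static readonly object Dummy = new object();

        // seed a root item; real children load only when the node expands
        public static TreeViewItem CreateNode(string path)
        {
          var node = new TreeViewItem { Header = Path.GetFileName(path), Tag = path };
          node.Items.Add(Dummy); // placeholder so the expander arrow shows
          node.Expanded += OnExpanded;
          return node;
        }

        static void OnExpanded(object sender, RoutedEventArgs e)
        {
          var node = (TreeViewItem)e.OriginalSource;
          if (node.Items.Count != 1 || node.Items[0] != Dummy) return; // already loaded
          node.Items.Clear(); // swap the dummy for the real subfolders
          foreach (string dir in Directory.GetDirectories((string)node.Tag))
            node.Items.Add(CreateNode(dir));
        }
      }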

    Our 4-Hour Body Recipe

    Link to Amazon

    Ferris gets right to his go-to Mexican oriented mix pretty quick into the fat loss section of the book… it’s very quick reading to pick up his basic approach.   Here’s how we’ve taken that and made it our own:

    • Full Size White Onion chopped up (probably any kind of onion will do)
    • Full Size Tomato chopped up
    • Full Green or Red Bell Pepper chopped up
    • Can of Black Beans (pinto and kidney work as well of course… we just really like black beans)
    • Grilled Large Chicken Breast

    Those items are all basically on a 1:1 ratio. *5* of each fills our 6 quart “stock pot”, which will last the two of us through a work week. Along with those primary stock items, we also flavor in some diced garlic… I love garlic, I go kind of nuts with it.

    To spice it up, I do a full 16 oz. jar of Vlasic *hot* pepper rings (what I call pepperoncini's)  -AND- a full 12 oz. jar of sliced hot jalapenos… including the juice from both (keep an eye out for corn syrup here)… those give it a fun kick… which should also help manage appetite. I think you’d want the full 5x of everything else to take on those full jars of spiciness… please start out with less until you find your preferred balance.

    To help freshen up each reheated serving, we melt in some grated cheese (keep it very minimal since this is on the avoid list), then toss on a dollop of sour cream and some avocado slices.

    Tim gives pointers on what to avoid just as much as what to embrace… pretty much all fruit sugars are on the avoid list, no big surprise. Another is to avoid many things that are white due to flour or starch, which is also a fairly common thread of advice from other dietary sources.

    It’s nice to load up the digital version of the book and hit the hyperlinks to the references… the weight tracking spreadsheet, etc.

    Good luck with your goals and have fun! :)

    HighPoint RocketRAID 620 indeed works for Hackintosh

    Update [2011 Aug 6]: The original drivers appear to work just fine under Lion v10.7
    Please see here for background on the “main PC = NAS” approach this hardware facilitates.
    And here for my other Hackintosh tribulations with getting my old graphics card to work.
    I’m very satisfied with this $60 part… the drivers loaded right up under both Win7 and OS X v10.6.6 (and 10.7 currently)
    As a side note: This all works well in tandem with Parallels Desktop v6’s Boot Camp virtualization facility where I can dual boot into my one sole Windows 7 install *natively* or via a Parallels VM under OS X (I know VMware has something identical but from what I’ve read, Parallels still has the edge on performance).
    The drivers on the install disc were up to date… and I’m taking it as a good sign that they haven’t found need to update them for over a year now.
    Windows 7 Driver – currently: v1.1.9.1221, 12/21/2009
    OS X Driver – currently: v1.1.0, 12/22/2009
    There is the usual BIOS based boot time configuration screen you can pop into to manage your arrays.
    And you can also install a management “Web GUI”… this is obviously driven by a mini web server that runs under your OS on a certain port… this is *NOT* plugging an ethernet cable into the RAID card itself… it is not that sophisticated… the whole thing is very bare bones, very old school, but seems to have the basics covered (time will tell)… it’s loaded via an old InstallShield-style setup.exe that I recognize from the early ’90s… the web screens themselves are completely boring old-school stuff, which stands out in a bad way these days, but truly, <ValleyGirlMode> whatevers </ValleyGirlMode>.
    Looking at the benchmark from CrystalDiskMark (screenshots below)… those sequentials look respectable, but the random-access rates are pretty poor… which is pretty much expected for 7200 RPM spinning disks: sequential transfers play to their strength, while small random reads/writes are dominated by seek latency, so those numbers always look ugly next to the sequentials.
    those specs are running the RocketRAID on these drives: Hitachi Deskstar HD32000 IDK /7K (2TB 7200 RPM 32MB Cache SATA 3.0Gb/s 3.5" Internal)
    [CrystalDiskMark benchmark screenshots]
    The card itself is minuscule… about 2.75 inches square (see below)… it is a 1-lane card (i.e. “x1” in common PCI Express parlance)… but it is PCIe 2.0, so you absolutely want to put it in a 2.0-capable slot if you can, and on my mobo that is a x16-lane slot… which “looks” like a waste but is totally fine for me, because I’m not a gamer, so I wasn’t using that secondary PCIe x16 slot for an SLI gfx card or anything useful anyway.
    [photo: DSCF5030 - closeup]
    Check out this last photo… I realized the RocketRAID card’s bracket alignment was off quite a bit (too short)… after installing, the card would slide itself loose of the slot… so much so that the mobo’s electrical-disconnect warning light for that slot came on… the bracket on my graphics card right next door doesn’t exhibit anything close to this height deficit, so I’ve got to assume the RocketRAID is a bit out of spec… after scratching my head for a minute, the obvious solution that presented itself was to move the bracket *under* my case’s card stability rail... it seems like my Antec Skeleton’s card rail particularly lends itself to this approach… I wonder if a normal case’s bracket screw-down area would allow the same trick?
    [photo: P1050814 - closeup]

    Reclaiming disk space from “system volume information”

    This sums it up very well

    Don’t forget to run the VSSAdmin commands under a 64-bit CMD.exe if you’re on 64-bit Windows.

    I know that sounds obvious, but I run something called Take Command Console LE, which is a great shell… but the free version is 32-bit only.
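
    For reference, the two commands in play look roughly like this (a sketch… check the linked write-up for your exact situation, and adjust the drive letter and MaxSize to taste):

    vssadmin list shadowstorage
    vssadmin resize shadowstorage /For=C: /On=C: /MaxSize=10GB

    The first reports how much space the shadow copies (i.e. “system volume information”) are allowed to consume per volume; the second caps it, which immediately frees up the excess.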

    Visual Studio 2010 Slow Startup [resolved]

    Apparently the culprit of my slowdown was the VMware debugger integration… resolution here

    I merely uninstalled the VMDebugger component via VMware Desktop setup.exe… I did *not* have to completely uninstall all of VMware Desktop to see a significant improvement in Visual Studio startup time… down to a few seconds now, from something that felt like 30 seconds.

    Fix Orphaned SQL Server Database Users

    Good reference

    Quick examples:
    EXEC sp_change_users_login 'report' -- this shows the list of orphaned users in the current database
    EXEC sp_change_users_login 'update_one', 'iTRAAC_User', 'iTRAAC_User' -- this is how you remap one
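
    Worth noting: sp_change_users_login is deprecated in newer SQL Server versions… the supported equivalent for the remap (SQL Server 2005 SP2 and later) is:

    ALTER USER iTRAAC_User WITH LOGIN = iTRAAC_User; -- re-links the database user to the same-named server login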
    

    Projector Roundup 2011 Q1

    For my own comparison notes, I got an "Epson PowerLite Home Cinema 720" for $1000 back in May 2009 (according to ProjectorCentral.com it’s already discontinued; I’m not surprised, this is a rapidly evolving technology segment)… so our specs are: 720p, 1600 lumens, 10,000:1 contrast, 3000 hr bulb, 2 yr warranty.

    The Epson Home Cinemas seem to be hanging strong in the top 10 most popular slots over the last few years.

    The Epson 8350 seems to be their latest and best in the $1000 range ($1200 street, 14 Feb 2011)

    1080p, 2000 lumens, 50,000:1 contrast, 4000 hr bulb, 2 yr warranty.

    Reviewers seem to indicate the Epson 8700 UB ("ultra black") is well worth the extra bux ($2100 current street)

    1080p, 1600 lumens (1800+ according to the review), 200,000:1 contrast, 4000 hr bulb, 2 yr warranty.

    The comparison to long-standing champ "Panasonic PT-AE4000U" at the end of this review is interesting.

    One thing to really look for is the *free bulb* (~$300 value) & other mail-in rebate specials that vendors use to drive attention to higher-end models, especially for the first year or so.

    I was kicking myself because I had just missed the window of opportunity for these rebates on mine by the time I was ready to buy.

    I feel like a free bulb takes a lot of the worry out of these suckers... they say about 3 yrs per bulb for average use...

    3 yrs is just long enough that you're going to loathe buying a new bulb vs. springing for whole new projector technology.

    6 yrs (the original bulb plus the free spare) is great peace of mind.

    The 8700UB just came out in Oct.2010 and apparently the free bulb offer goes to March *2012*

    and there's also a $100 mail-in rebate that goes through March *2011*

    CTRL, 1 – Mutes Windows Sound?!?!

    Is this a standard???  I can’t find reference to this hot key sequence out there in Google land.

    You need to hit CTRL, release, and then hit 1… not CTRL-1, i.e. don’t hold down CTRL.

    On my system here (a keyboard *without* Windows media keys, under Windows 7), it works with both left and right CTRL, and with the “1” on either the main keyboard or the numeric keypad.


    Google Keywords: Windows 7, Audio, Mute, Volume, Control, CTRL, 1, One

    150W Auto DC to AC Inverter with USB

    Here’s the one I got:

    [photo of the inverter]

    Input: DC 12V
    Output: AC 110-120V, 60Hz, 150W
    USB: DC 5V, 500mA

    I use it for our camera battery chargers when we go on road trips.
    It's a pretty compact arrangement to plug a camera battery charger into this and forget about it.
    And it's nice to be able to toss our iPods, Smart Phones, etc on the USB port.

    I've read that charging camera batteries requires more juice than the USB port can supply (5V × 500mA is only 2.5W),
    and that's probably why those chargers are all AC wall plugs only.

    They don't show it in the picture, but on the reverse side of the unit is a little 20A *external* fuse (i.e. easily user serviceable).
    You don't always see that kind of feature with these kinds of cheapo things.
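
    (For scale, and assuming I have the math right: 150W out of a 12V input works out to 150 ÷ 12 ≈ 12.5A on the DC side at full load… which is why these things need a beefy automotive-style fuse at all.)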

    There's actually a little fan in that box which is probably a good thing.
    Unfortunately you can hear it when it cranks up... it's not a deafening noise but you can hear it.

    I actually don’t like Xoxide that well… they definitely bungled my last order… you might want to source this somewhere else… the model number appears to be “DAU-150”… I can’t identify a manufacturer name anywhere on the outside…

    Here’s a Google Search for similar items (“inverter” is a keyword to include in your searches)… there’s plenty of good options out there…

    Here’s one on eBay that looks just like mine but also sports a dual Euro/US compatible AC socket.