/* BeejBlog */

Ode to Griffin AirClick USB - Radio Frequency PC Media Remote

This little bugger just so totally rocks!!!  IMHO the most compelling aspects are:

  • It’s cheap :). They tend to go for around $10-$25. There are still some out there on eBay from time to time (not now) and also Amazon at the moment.
  • It’s Radio Frequency technology – so you can zap iTunes to the next song from around the corner or out in the yard!!  Even my fancy iMON VFD remote is Infrared based (limited by line-of-sight) and that winds up being a deal breaker in my environment… couch faces projector wall away from the PC, IR = major fail! :(
  • It’s simple! – there are only the 5 most critical buttons to distract you… none of that typical Windows Media Center remote overload to worry about here… Play/Pause, Previous, Next & Volume Up/Down, that’s it.

Unfortunately, the vendor, Griffin, has chosen to discontinue this little wonder.  If you’re interested in driving your PC based Media Players, make sure to get the USB version, not the iPod version which appears to still be in production. Take note, the transmitters that come with the readily available iPod version are 100% compatible with the USB receiver. This is a nice way for us to obtain replacement transmitters to have around.  Just check eBay… I just got a pair of clickers, including the iPod receiver and an interesting Velcro mount for $4.50, including shipping!!!

Griffin is nice enough to continue hosting their support page with drivers <whew>.  These native drivers work on any 32bit Windows since XP (more on 64bit support below).

And Dmitry Fedorov has been keeping the dream alive by showing us how to build our own little application-specific .Net plugins for the basic Griffin driver API.

Ok so that’s all fine and dandy, now let’s get to the good stuff!!
I currently like iTunes and VLC and Windows 7 64bit, and I’ve found a couple of freewares that, well, make beautiful music together (couldn’t resist :)

iTunesControl – In his infinite wisdom, Mr. Jobs hasn’t seen fit to support *global* Windows Media Keys in iTunes … fortunately for us, Carson created iTunesControl. Within the HotKeys section, one must simply click the desired key event (e.g. “Next Track”) and then press the corresponding AirClick button to make the assignment (Don’t forget to hit the Apply button).  It also provides a very nifty, super configurable Heads Up Display that I absolutely love. To be more specific, I only mapped Play/Pause, Next & Previous this way.  I left volume up/down defaulted to Windows global volume which provides convenient system wide volume control no matter what’s active (see last paragraph).

Now, the dark clouds started rolling in when I upgraded to Win 7 64bit and realized that the basic Griffin software install does not happen under 64bit, zip, nada, no-go <waaahh>… then I found this next little gem, affectionately called…

AirClick Interface Script - The way Jared explains it, fortunately for us, at least the HID layer of the Griffin driver is operational under 64bit. So he wrote an AutoHotKey script which picks up on the HID messages coming from the AirClick and turns those into Windows Media Keys.  The WinMedia Keys are then caught by iTunesControl and iTunes immediately does our bidding, brilliant! Jared provides his original script source as well as a convenient compiled exe version that you just run and go.

NOTE: Jared’s script maps a 4-second press of volume-down to sending the PC night-night. To me this isn’t so handy and I’d much rather have a repeating volume adjust while held down. So I tweaked his script a little, find that here (ready-to-run EXE version). If you wish to run the raw script or perhaps compile some of your own tweaks, then you must use the original AutoHotKey. The newer “AutoHotKey_L” branch would not work for me.

The last thing I’ll mention is subtle but neato… Jared’s script actually checks to see which window is active.  If none of the well knowners is focused (VLC, Winamp, MediaPlayerClassic, PowerDVD), then it defaults to firing Windows Media Key events.  The nice thing is, if say VLC is active, then Jared’s script fires VLC specific play/pause, rewind & fast forward keys … so if I’m bouncing around the house, iTunes is getting the WinMedia events… if I’m sitting down watching a movie, I just have to make sure VLC is the active window and iTunes is left alone, perfectly intuitive!
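That dispatch logic boils down to something like the following sketch (the window titles and key names here are my own illustrative stand-ins, not Jared’s actual AutoHotKey code):

```javascript
// Decide which key event to fire for a remote button press, based on the
// currently active window. App-specific keys win; otherwise fall back to
// global Windows Media Key events (illustrative mappings only).
var appKeyMaps = {
  "VLC media player": { play: "space", prev: "left", next: "right" },
  "Media Player Classic": { play: "space", prev: "ctrl+left", next: "ctrl+right" }
};

var globalMediaKeys = { play: "Media_Play_Pause", prev: "Media_Prev_Track", next: "Media_Next_Track" };

function dispatch(activeWindowTitle, button) {
  var appMap = appKeyMaps[activeWindowTitle];
  return appMap ? appMap[button] : globalMediaKeys[button];
}
```

So with VLC focused the remote drives VLC directly, and from anywhere else the same buttons fall through to iTunes via the global media keys.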

UPDATE 10 March 2012

It’s a nice pastime to watch a photo slideshow while listening to tunez. Previously I’d been using the Google Photo Screensaver. But we soon wanted the ability to back up and stare at one of the slideshow photos, via the remote. I found Photo Screensaver Plus by Kamil Svoboda to fit very well. Among other very robust functionality, it supports left cursor to back up in the photo stream and space to pause the slideshow. With that, I edited my new AutoHotKey script (exe) to provide the following:

  • when slideshow screensaver is not running, hold down play/pause remote button to start up screensaver slideshow
  • when slideshow is running, reverse button goes to the previous image and pauses the slideshow
  • when slideshow is paused, play/pause restarts the slideshow
  • otherwise all buttons pass through to media events as usual
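The bullets above amount to a tiny state machine; here’s a sketch (my own naming, the real logic lives in the AutoHotKey script):

```javascript
// Map a remote button press to an action based on slideshow state:
// "off" (screensaver not running), "running", or "paused".
// Hypothetical names, not the actual script's symbols.
function handleButton(state, button) {
  if (state === "off" && button === "playHold")
    return { state: "running", action: "startScreensaver" };
  if (state === "running" && button === "prev")
    return { state: "paused", action: "previousImageAndPause" };
  if (state === "paused" && button === "play")
    return { state: "running", action: "resumeSlideshow" };
  return { state: state, action: "passThroughMediaKey" }; // everything else as usual
}
```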

I really like how you can dual purpose the buttons depending on the context… that’s powerful.

Kamil’s screensaver also provides a hotkey to copy the current image to a favorites folder, very cool.  And a hotkey to edit the image’s EXIF metadata – Name, Description & Comment.  The nifty thing there is we also publish our photos via a Zenphoto web image gallery. Once we edit the EXIF info in the screensaver, a little PowerShell script of mine refreshes ZenPhoto’s MySQL database entry for that image so the new image name and comments are immediately available for viewing and SEARCHING via the web gallery’s search facility, nice!  The PowerShell script uses Microsoft’s PowerShellPack to provide effortless FileSystemWatcher integration. We really do have everything we need to mix and match unintentional building blocks into our own satisfying hobby solutions these days with amazingly little effort. I mean, who could’ve anticipated using a screensaver of all things as a data entry front end?

 

Hot Corners - This free tool does the job and AutoIT source code is provided.

Evolving a custom ADO.Net based repository

All code.

Update 3/24/2015 - Google is shutting down code.google.com :( but provides easy migration to github: new source code link.

Main WPF4 LOB project post.

Concept: A framework for maintaining column specific repository consistency with database update “side effects”.

Prerequisites:
  • Stored procedures = business rules - My data layer is basically a home grown spin on ADO.Net Typed DataSets.  i.e. “Business” Class wrappers around ADO.Net classes (DataSets, DataRelations, DataTables, DataViews, DataRows, DataRowViews, etc).  I like to keep the majority of my business rules in stored procedures (“procs”).  I've experienced sustained, maintainable progress on LOB projects facilitated by an evolving relational model.  It's often beneficial to meet growing awareness of business entity relationship requirements entirely in the proc queries with no changes necessary in higher layers.  Being able to change how a particular list is populated WITHOUT REQUIRING A BINARY RELEASE can be very powerful.  I realize this may all seem controversial to an OO mindset but it’s served me well over multiple *database* oriented projects. If your project is not inherently table oriented, please stop right here. This is very much a relationally oriented design approach. If one is fortunate enough to have the freedom to design the database as part of the overall solution scope, and therefore stored procedures are fair game, then to not take advantage of procs as “business methods” is throwing away a huge asset. If one is not that lucky, and I realize big corporate projects tend not to be, then I completely understand taking great OO measures to insulate one’s beautiful architecture away from the messy legacy database structure. EntityFramework welcomes you :)  Otherwise, I feel that remaining near and dear to one’s mother database is a very fruitful relationship.  Procs are easily maintainable and deployable - no binaries, very scriptable.
    • Naturally, accepting dependence on a database for business rules does imply that our application must be generally connected to a database. One could argue this doesn’t fly for disconnected client scenarios, i.e. mobile devices. However, it’s not far-fetched to have a local database which provides this support and then updates to the big mother database (cloud, etc) when connectivity is restored. One could still leverage the readily deployable nature of stored procs to provide the right business smarts to the local DB. Indeed, it’s a tiered relational-centric model vs the typical tiered OO-centric architectures which relegate relational technology to the last tier only :)
  • MS SQL Server 2005+ - This post includes the usage of the SS 2005+ "OUTPUT” syntax. As with anything, the underlying concepts I’m proposing could be reapplied to other platforms and frameworks (I’m not familiar but wouldn’t be surprised if Oracle allows something similar). T-SQL OUTPUT is just a nice syntactical convenience with mild potential performance benefit.
Business Example
To frame a case which demonstrates the need for side effects – take a look at the adjacent screenshot. The scenario is we’ve got a household with some people in it (aka members, aka clients). In this business domain only one person can be the sponsor of a household at any given time. Likewise there can be only one spouse set, the spouse being whoever is not the sponsor. These designations are maintained as flags on the “Clients” database table. In this example, we’re exploring what needs to happen when the sponsor changes from one person to another. This can happen when the existing sponsor leaves the business system which grants this privilege, yet the spouse remains in the system and can therefore assume the sponsorship privilege and nothing else needs to change.
So, in the UI the current sponsor is Sgt. John Snuffy. To effect this desired change, the user would select the “Set Sponsor” button on the spouse entry (Mrs. Jane Snuffy). As is typical in tiered design, this button fires a Business Object method SetSponsor(…).

By design, my Business Class methods tend to be fairly light wrappers around proc calls. For example (full source):
public void SetSponsor(string NewSponsorClientGUID, bool FixExistingPackageLinks)
{
  using (iTRAACProc Sponsor_SetSponsor = new iTRAACProc("Sponsor_SetSponsor"))
  {
    Sponsor_SetSponsor["@SponsorGUID"] = GUID;
    Sponsor_SetSponsor["@NewSponsorClientGUID"] = NewSponsorClientGUID;
    Sponsor_SetSponsor["@FixExistingPackageLinks"] = FixExistingPackageLinks;
    TableCache(Sponsor_SetSponsor);
    HouseMembers = HouseMembers; //for some reason OnPropertyChanged("HouseMembers") didn't refresh the Members Grid, i don't have a good guess but this little hack worked immediately so i'm moving on
  }
}
The TableCache(Sponsor_SetSponsor) call above is the huckleberry.  It resides in the BusinessBase class… basically it fires the proc and then goes into the DataSet.Merge() logic explained below.
While we’re looking at this code, let me quickly divert to explain the “Proc” class. Nutshell: Proc is a convenient wrapper around ADO.Net SqlCommand. Among other things it does the SqlCommandBuilder.DeriveParameters() + caching thing that you’ll find in many similar wrappers like this (e.g. Microsoft’s Data Access Application Block - I just didn’t fall in love with their API and wanted my own spin). DeriveParameters() removes the dreary burden of all that boring proc parm definition boilerplate code prior to each proc call (add param by name, set the datatype, etc.) and just pulls it all out of the database metadata, which already knows all that information anyway - brilliant. Therefore we get right to the point of assigning values to named proc parms and firing the query. SqlClientHelpders.cs contains the Proc class as well as all kinds of data helper methods that have evolved over several projects. I wouldn’t want to start a database project without it at this point.
iTRAAC is the name of the project I pulled this example from. iTRAACProc is a very light subclass that assigns a few common domain specific parms (e.g. UserID) before handing off to the base Proc class. Conveniently, the Proc class' parm[“@name”] indexer ignores anything that's not declared on the specified proc, so only procs that actually require these parms will receive them.
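To illustrate the derive-once-and-cache plus forgiving-indexer behavior, here’s a rough sketch (mocked metadata; the real Proc class pulls parameter definitions from SQL Server via DeriveParameters()):

```javascript
// Proc-ish sketch: parameter names are "derived" once per proc and cached;
// assigning a parm the proc doesn't declare is silently ignored, so common
// parms like @UserID can be blindly applied to every call.
var parmCache = {}; // procName -> declared parm names

function deriveParms(procName) {
  if (!parmCache[procName]) {
    // mocked stand-in for SqlCommandBuilder.DeriveParameters() metadata
    var metadata = {
      "Sponsor_SetSponsor": ["@SponsorGUID", "@NewSponsorClientGUID", "@FixExistingPackageLinks"]
    };
    parmCache[procName] = metadata[procName] || [];
  }
  return parmCache[procName];
}

function makeProc(procName) {
  var declared = deriveParms(procName);
  var values = {};
  return {
    set: function (name, value) {
      if (declared.indexOf(name) >= 0) values[name] = value; // ignore undeclared parms
    },
    values: values
  };
}
```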
Ok so back to our scenario… Besides setting the flag on Jane’s record to indicate she is now the sponsor, we also need to remove the sponsorship flag from John as well as flip the spouse flag from Jane to John (other queries and reports depend on having those flags consistent)… and oh, by the way, we also want to log all of this to the audit table so there’s a historical reference of what changes brought us to the current state of a household.  We want to drive all of this from the database proc logic and once the database has changed we want the UI to magically update to reflect all these changes and additions (including the new audit record aka “Diary” in the UI). So this is where we’ve arrived at what I call side effects (maybe there’s a better term?). That is - corresponding to a relatively innocent looking user action, our desired business rules will drive various values to be changed and entirely new rows to be added that are not directly maintained by the user. This is not simple CRUD table maintenance, this is real business rules with all the crazy interconnections that must be supported :)

Update-proc example (full source):
SET @TableNames = 'Client'
UPDATE iTRAAC.dbo.tblClients
SET StatusFlags = CASE WHEN RowGUID = @NewSponsorClientGUID THEN StatusFlags | POWER(2,0)
                  ELSE StatusFlags & ~POWER(2,0) END
OUTPUT INSERTED.RowGUID, CONVERT(BIT, INSERTED.StatusFlags & POWER(2,0)) AS IsSponsor
WHERE SponsorGUID = @SponsorGUID
AND RowGUID IN (@OldSponsorClientGUID, @NewSponsorClientGUID)
The first line above (SET @TableNames = 'Client') is pertinent.  By convention, all procs which need to participate in the approach I’m proposing in this post must have a @TableNames OUTPUT parameter.  This is a CSV list of table names corresponding to each resultset returned from the proc (in sequential order).  This way, the proc *generically* informs the datalayer what must be merged into the client data cache (repository).

The OUTPUT clause above is cool - rather than re-SELECTing the modified data, OUTPUT lets us leverage the fact that UPDATE already knows which rows it hit. I dig it.  Back on the client side, the datalayer takes that PARTIAL (i.e. very column-specific) resultset and merges it back into the cache like so (full source):
//nugget: DataSet.Merge(DataTable) has become a real linchpin in the whole data roundtrip approach
//nugget: in a nutshell, update procs return a bare minimum of updated fields in a return resultset along with a corresponding CSV list of @TableNames
DataTable cachedTable = dsCache.Tables[tableName];
dsCache.Merge(incomingTable, false, (cachedTable == null) ? MissingSchemaAction.AddWithKey : MissingSchemaAction.Ignore); //PreserveChanges pretty much has to be false in order to count on what comes back getting slammed in
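Stripped of ADO.Net specifics, the roundtrip amounts to pairing each resultset (by position) with a name from the @TableNames CSV and merging rows into the cache by primary key, only touching the columns that actually came back. A hypothetical sketch (stand-in for DataSet.Merge(), not the actual datalayer code):

```javascript
// Merge column-specific resultsets into a client-side cache keyed by RowGUID.
// tableNamesCsv pairs positionally with resultsets, per the proc convention.
function tableCache(cache, tableNamesCsv, resultsets) {
  var names = tableNamesCsv.split(",");
  names.forEach(function (name, i) {
    var table = cache[name] || (cache[name] = {});
    resultsets[i].forEach(function (row) {
      // partial merge: only the returned columns are overwritten
      table[row.RowGUID] = Object.assign(table[row.RowGUID] || {}, row);
    });
  });
  return cache;
}
```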
The Big Picture
What this approach tees up is that your procs can drive an unlimited amount of side effects which can be granularly returned to the client side cache.

Since you can pick and choose exactly which columns are returned (via standard selects or OUTPUT clause) you can weave a fine tuned blend between exactly which fields are allowed to come back in the side effects and blast into the client cache vs what fields may have pending uncommitted user edits in the cache. That’s pretty cool.

View->ViewModel (MVVM) environments with robust declarative databinding, like WPF, really shine when you see all of these side effects immediately manifest on the UI just by bringing the data back into the BusinessObject(DataSet) cache (that the UI is bound to).  The procs are very much in control of the business logic and ultimately what’s displayed, yet without being coupled to the UI. Great stuff.

Additional perks in the code provided:
  • An interesting “union-like” benefit in the datalayer – I ran into requirements where the most appealing clean design was to modularize subroutine procs that would be called from other procs. Fair enough so far. On top of that I found need to return these field level data changes (aka side effects) for the same entity table, from multiple procs in the subroutine chain. e.g. Client –> Proc1 –> SubProc2 & SubProc3. The impact of burdening the T-SQL proc layer with capturing the multiple proc results and union’ing them together is ugly design. It wound up being very clean and convenient to defer the union of these multiple selects to the TableCache C# datalayer logic. The “union” effect is readily implemented by looping through the tables of the same name and using ADO.Net’s “DataTable.merge()” to apply each incoming rowset to the existing entities in the repository cache. Including matching primary keys in the incoming rowsets facilitates updates to cached entities vs inserts.
  • Handy initial client side rows – I should say, this next bit is actually a technique that’s struck me as convenient yet it’s not specifically dependent on the TableCache approach … these building blocks do all however play into each other to nicely address what I’ll call the “new row dilemma” … that is, one typically needs some blank rows to databind to when you’re creating a new record in the UI… but it’s often undesirable to physically manifest these rows in the database until you’re sure they’re really going to be committed… it really stinks to sacrifice data integrity constraints just to allow for initial empty rows… a typical solution is to DataTable.Rows.AddRow() on the client and leave the DB out of it until you commit fully validated rows… but now client code is responsible for initializing new rows. I hate that for a couple reasons. First, I want that logic in the procs, where I can evolve it at will at the database tier w/o needing to deploy a new client binary. Secondly, for client logic consistency, it’s much cleaner for new row logic to work exactly the same way as existing row logic. So the execution goes something like this:
    1. New row event on client generates a brand new GUID PK (Some form of very unique ID seem fairly necessary to allow the client to do this autonomously from the DB).
    2. But otherwise the client logic just flows into the standard “GetEntityByID” proc call, passing the new GUID none the wiser whether it’s new or existing… i.e. zero difference in logic flow between a new record and an existing record, nirvana :).
    3. Of course this fresh GUID won’t get a row hit which conditionally falls into the new row logic where I return a “fake” row populated with whatever defaults I desire… take note, I’m not actually inserting a row into the table and then selecting that back out, I’m doing a select with “hard coded” values and no “from table” clause… that way I don’t insert junk data nor forsake constraints, but the new row logic is kept in the proc layer – beautiful.
    4. Lastly, when committing to the DB, you fire the typical upsert proc which checks if it's doing an insert or update by seeing if the PK exists and acting accordingly.
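The four steps above can be sketched like so (hypothetical names; in the real thing the defaults come from the GetEntityByID proc’s hard-coded SELECT):

```javascript
// New-row flow sketch: the client always calls the same get-by-ID path;
// a miss yields a defaults "fake" row that never touches the table, and
// commit decides insert vs update purely by whether the PK already exists.
function getEntityById(db, guid) {
  // on a miss, the proc would SELECT hard-coded defaults with no FROM clause
  return db[guid] || { RowGUID: guid, StatusFlags: 0 };
}

function upsert(db, row) {
  db[row.RowGUID] = row; // insert or update, decided by PK existence
}
```

Either way, the client code path is identical; only the database knows whether the GUID was fresh.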
Dear reader what do you think? Am I nuts? Am I bucking the OO trend too much? Will the OO gods conspire to make me pay dearly for my database blasphemy :)

Going Portrait!

Do you realize how freakin’ cool it is to be able to hit ALT-F1 on a table in SSMS and be able to immediately view all of the output without futzing around in all the little result-set scroll bars!?! There’s 6 tables that come back from sp_help and now they have the room they deserve… the visual query plan tool is more horizontal so I have a feeling that’s going to take a little hit… we’ll see. Great for dual pane WPF dev in VS2010 too… with a typical visual pane up top, there’s now tons more room for raw XAML in the bottom … and XAML gets verbose in a hurry so this was becoming a critical annoyance for me.

And not only dev oriented activities, Outlook feels better too… and it’s amazing how many web pages seem like they were made for portrait… so nice to see a whole page at once w/o a scroll bar.

Bottom line: THE most dramatic yet drop dead easy computer improvement I’ve done in a long time.


Transcoding Motion-JPEG (.MOV) to MPEG-4 (H264)

Newer Approach

I have a Panasonic DMC-ZS5 (dpreview, Panasonic) which creates .MOV files that contain the Motion JPEG (M-JPEG) “format” (for want of a more technical term).

In order to stream those videos from my IIS/PHP based photo gallery (zenPhoto.org), they must be converted to a more “web compatible” format like MPEG-4.  I haven’t found a more straightforward approach than direct batch conversion to another format… you can readily automate the conversion of say all the videos in a folder so it’s pretty much turnkey and ignore.

Update 2015-01-05: This is my current go-to:
for %v in (*.mov) do ffmpeg -i "%v" -vcodec h264 -acodec aac -strict -2 "%~nv.mp4"
Notes:
  • make sure to double up the percent signs (%%v) if you put this in a .cmd batch file
  • ffmpeg is a very popular 3rd party command line util. I get mine from here.

Update 2015-07-18: cropping 3D movies down to single image
ffmpeg -i "in_movie_file.ext" -vf "crop=960:800:0:0,setdar=4:2" -vcodec h264 -acodec aac -strict -2 "out_movie_file.mp4"
  • obviously check the real resolution before setting the crop... just divide it by 2
  • "setdar" is the aspect ratio... i found it was necessary... one way to find it is with VLC CTRL-J on the original video

VLC will do this via a command line like so:
"c:\Program Files (x86)\VLC\vlc.exe" -vvv %1 --sout=#transcode{acodec=mpga,vcodec=h264,venc=x264,deinterlace,vfilter="rotate{angle=270}"}:standard{mux=mp4,dst="%~n1.mp4"}, vlc://quit

Notes:
  • I’ve had to remove the acodec=mpga for my iPhone MOV’s or else I get garbled audio.
  • I included the vfilter=”rotate…” for rotation syntax since it was so hard for me to find but only include if you want rotation.

However, I noticed that VLC chops off the last 2 seconds no matter what I do… it seemed a little better choosing a different vcodec but h264 is too rocking to use anything else.

So I wound up going with QuickTime as my go-to transcoder for now.  It doesn’t truncate any video and creates a slightly smaller output file than VLC.  The compression is dramatic and h264 does an awesome job with preserving quality… even while maintaining 1280 x 720 HD, a 100MB MJPG will go down to a 5MB h264/MPEG file.

The following code, stolen from here and tweaked a little, automates the QuickTime COM API to convert a directory full of MJPGs (see the sample code for Chapter 8 > “BatchExport.js”).

There’s no reason why this shouldn’t be in PowerShell… it’d be interesting to see if it was any more readable.
//----------------------------------------------------------------------------------
//
//    Written by    :    John Cromie
//    Copyright    :    © 2006 Skylark Associates Ltd.
//                                                                               
//    Purchasers of the book "QuickTime for .NET and COM Developers" are entitled   
//    to use this source code for commercial and non-commercial purposes.                   
//    This file may not be redistributed without the written consent of the author.
//    This file is provided "as is" with no expressed or implied warranty.
//
//----------------------------------------------------------------------------------
 
 
function unquote(str) { return str.charAt(0) == '"' && str.charAt(str.length - 1) == '"' ? str.substring(1, str.length - 1) : str; }
 
 
// Run from command line as follows:
//
// cscript BatchExport.js <sourcePath> <destPath> <configXMLFilePath> <convertFileExtension> <exporterType> <exportFileExtension>
 
var sourcePath, destPath, configXMLFilePath, convertFileExtension, exporterType, exportFileExtension;
 
// Get script arguments
if (WScript.Arguments.Length >= 6)
{
    sourcePath = unquote(WScript.Arguments(0));
    destPath = unquote(WScript.Arguments(1));
    configXMLFilePath = unquote(WScript.Arguments(2));
    convertFileExtension = unquote(WScript.Arguments(3));
    exporterType = WScript.Arguments(4);
    exportFileExtension = WScript.Arguments(5);
}
 
//sourcePath = "D:\\QuickTime\\Movies\\Birds\\Kittiwake";
//destPath = "D:\\QuickTime\\Movies\\Export\\Dest";
//exporterType = "BMP";
//exportFileExtension = "bmp";
 
// Sanity check arguments
var fso = WScript.CreateObject("Scripting.FileSystemObject");
 
var e = "";
 
if (!fso.FolderExists(sourcePath))
    e += "Source path does not exist : " + "[" + sourcePath + "]\n";
    
if (!fso.FolderExists(destPath))
    e += "Destination path does not exist : " + "[" + destPath + "]\n";
 
// note: configXMLFilePath points at a file which may not exist yet
// (it gets created on first run), so no existence check here
 
if (convertFileExtension == undefined)
    e += "No convert file extension supplied!\n";
 
if (exporterType == undefined)
    e += "No exporter type supplied!\n";
    
if (exportFileExtension == undefined)
    e += "No exporter file extension supplied!\n";
 
if (e != "")
{
    WScript.Echo(e);
    WScript.Echo("Usage:");
    WScript.Echo("cscript BatchExport.js <sourcePath> <destPath> <configXMLFilePath> <convertFileExtension> <exporterType> <exportFileExtension>");
    WScript.Quit();
}
 
// Launch QuickTime Player   
var qtPlayerApp = WScript.CreateObject("QuickTimePlayerLib.QuickTimePlayerApp");
 
if (qtPlayerApp == null)
{
    WScript.Echo("Unable to launch QuickTime Player!");
    WScript.Quit();
}
 
var qtPlayerSrc = qtPlayerApp.Players(1);
 
if (qtPlayerSrc == null)
{
    WScript.Echo("Unable to retrieve QuickTime Player instance!");
    WScript.Quit();
}
 
// Set up the exporter and have it configured
var qt = qtPlayerSrc.QTControl.QuickTime;
qt.Exporters.Add();
var exp = qt.Exporters(1);
exp.TypeName = exporterType;
 
// settings file...
var FileSystemObject =  WScript.CreateObject("Scripting.FileSystemObject");
var configXMLFileInfo;
 
if ( FileSystemObject.FileExists(configXMLFilePath) )
    configXMLFileInfo =  FileSystemObject.OpenTextFile( configXMLFilePath );
 
// if settings files exists, load it and assign it to the exporter
if ( configXMLFileInfo )    {
    var configXMLString = configXMLFileInfo.ReadAll();
    // cause the exporter to be reconfigured
    // http://developer.apple.com/technotes/tn2006/tn2120.html
    var tempSettings = exp.Settings;
    tempSettings.XML = configXMLString;
    exp.Settings = tempSettings;
} else  {
    //otherwise, get the settings from the user dialog and save them to xml file for subsequent runs
    exp.ShowSettingsDialog();
 
    var configXMLString = exp.Settings.XML;
    configXMLFileInfo = FileSystemObject.CreateTextFile( configXMLFilePath );
    if ( configXMLFileInfo )  {
        configXMLFileInfo.WriteLine(configXMLString);
        configXMLFileInfo.Close();
    } else {
        WScript.Echo("Unable to create config XML file : " + "[" + configXMLFilePath + "]");
        WScript.Quit();
    }
 
}
 
 
var fldr = fso.GetFolder(sourcePath);
 
// Regular expression to match file extension (dot must be double-escaped
// inside a string literal so the regex sees a literal ".")
var re = new RegExp("\\." + convertFileExtension + "$", "i");
 
// Iterate over the source files
var fc = new Enumerator(fldr.Files);
for (; !fc.atEnd(); fc.moveNext())
{
    var f = fc.item().Name;
    
    // Filter by file extension
    if (!re.test(f))
        continue;
    
    try
    {
        // Open the movie and export it
        qtPlayerSrc.OpenURL(fc.item());
        
        var mov = qtPlayerSrc.QTControl.Movie;
        if (mov)
        {
            exp.SetDataSource(mov);
            
            // Strip file extension and compose new file name
            f = f.replace(/\.[^\.]*$/, "");
            var fDest = destPath + "\\" + f + "." + exportFileExtension;
            
            exp.DestinationFileName = fDest;
            exp.BeginExport();
            
            WScript.Echo("Exported: " + fDest);
        }
    }
    catch (err)
    {
        WScript.Echo("Error Exporting: " + fc.item());    
    }
        
}
 
// Tidy up
qtPlayerSrc.Close();