Archive for the ‘Linux’ Category

Quick and Dirty DVR Redux

Wednesday, December 19th, 2012

Dropbox is handy
It’s over a year since my blog post entitled “Quick and Dirty DVR” and I’ve been using and tweaking the code ever since. What I have now is a pretty autonomous and functional video recorder based around that core record script.

All code mentioned in this post can be found either in my scripts or lib svn repository. I apologise for the hard-coded paths…

What I’ve added

  1. Scheduling & collision detection
  2. File management
  3. Ad detection
  4. Remote scheduling

Automatic scheduling

The first thing was to get the script to find and record TV shows automatically. To do this, the script needs to know:

  1. A source of TV listings
  2. A list of shows that I watch
  3. The last episode number of a given show that has already been recorded
  4. When all subsequent shows are on, in the future

The first step is easy – just install the XMLTV package and set up a crontab entry: 0 2 * * * /usr/bin/tv_grab_uk_rt --quiet > /share/tv/listings.xml. This fetches fresh TV listings nightly.

The list of TV shows is simply a flat text file – one show title per line. I took the decision to allow regular expressions in show titles when the line starts with “like”. For example, QI and QI XL are largely interchangeable, so the entry for QI is “like QI( XL)?”, meaning the “XL” part may or may not be present but either way it’s treated as the same show.
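For what it’s worth, the matching rule boils down to something like this (a hypothetical shell sketch – the real check lives in my PHP scripts). Lines starting with “like” are treated as anchored regular expressions; anything else is an exact title match.

```shell
#!/bin/sh
# Hypothetical helper sketching the subscription matching rule.
matches_subscription() {
  title="$1"; sub="$2"
  case "$sub" in
    "like "*) printf '%s\n' "$title" | grep -qE "^(${sub#like })$" ;;
    *)        [ "$title" = "$sub" ] ;;
  esac
}
```

So “like QI( XL)?” matches both QI and QI XL, while a plain “Peep Show” line only ever matches Peep Show.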

Then I just have to write something to work out which TV shows I already have. It’s very helpful to have your TV neatly organised for this. All my shows are in the path /share/tv/[name of tv show]/[name.of.tv.show].s[series]e[episode].[subtitle].[extension], so inside tv.func.php is the function getLastEpisodeOf($name,$dir) which returns text describing which episodes of that show should be recorded, e.g. “Peep Show after season 8 episode 2”.
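A rough shell equivalent of that lookup (hypothetical – the real getLastEpisodeOf() is a PHP function) just scans the filenames for the highest sNNeNN tag:

```shell
#!/bin/sh
# Hypothetical sketch of getLastEpisodeOf(): find the highest sNNeNN tag
# among a show's files and describe what should be recorded next.
last_episode_of() {
  dir="$1"
  last=$(ls "$dir" 2>/dev/null \
    | grep -oE 's[0-9]+e[0-9]+' \
    | sort -t e -k1.2n -k2n \
    | tail -n 1)
  if [ -z "$last" ]; then
    echo "from the start"        # nothing recorded yet
    return
  fi
  season=${last%e*}; season=${season#s}
  episode=${last#*e}
  echo "after season $season episode $episode"
}
```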

Next the listings file needs to be parsed. For a while I had some very dumb parsing which went through each line of the subscriptions file and picked the first available episode of whichever show it was searching for. This was flawed as the scheduler wasn’t aware of other shows being recorded and therefore clashes in scheduling were inevitable. The most I could do was to detect the collisions between shows after the schedule was written and manually correct them.

After a bit of thought I replaced the scheduler with a new script which treats the order of the TV shows file as a priority list for each show and checks that each show it tries to schedule doesn’t clash with one already scheduled. If it does, alternative broadcast times are checked and used; the presence of “+1” channels makes this a lot more robust. If a show has no suitable recording slot at all, it’s dropped from the schedule and a warning is written into the crontab output as a comment.
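The clash test itself is just the classic interval-overlap check, sketched here in shell (the real scheduler is PHP):

```shell
#!/bin/sh
# Two recordings clash iff each one starts before the other ends.
# Arguments: start1 end1 start2 end2, as epoch seconds (or any
# consistent integer timestamps).
overlaps() {
  [ "$1" -lt "$4" ] && [ "$3" -lt "$2" ]
}
```

Back-to-back slots, where one recording ends exactly as the next starts, don’t count as a clash.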

This is an extract of my subscriptions file:

Fresh Meat
The Killing
Peep Show
Wallander
Bang Goes the Theory
Doctor Who
Sherlock
Top Gear
The Simpsons
like QI( XL)?
like Have I Got.* News For You
The Snowman and The Snowdog
The Royal Institution Christmas Lectures

And this is an extract of how the recording schedule looks for the Christmas 2012 period:


# Clash: The Royal Institution Christmas Lectures s1e1 Royal Institution Christmas Lectures 2012: The Modern Alchemist: Air: The Elixir of Life on BBC Four for 61 at 2012-12-26 20:00-21:01 with EastEnders s1e4556 2012-12-26 Wednesday on BBC One for 31 at 2012-12-26 20:30-21:01
# Clash: Tangled on ITV1+1 (Central) for 111 at 2012-12-25 16:10-18:01 with Doctor Who s7e6 The Snowmen on BBC One for 61 at 2012-12-25 17:15-18:16
# No suitable schedule found for Tangled

# Peep Show after Season 08 episode 04
30 22 23 12 * root /home/iain/bin/record 'Peep Show s8e5 Chairman Mark on Channel 4 for 31' # Jeremy is desperate to avoid living with Super Hans and rekindles his relationship with Mark's sister Sarah, moving in with her and her five-year-old son Joshy. In an attempt to deal with a damp patch on his bedroom wall, Mark launches a campaign to be elected chairman of Apollo House's freehold committee. Starring Robert Webb and David Mitchell. The final episode is tomorrow at 10pm.
00 22 24 12 * root /home/iain/bin/record 'Peep Show s8e6 Quantocking II on Channel 4 for 31' # Dobby is offered a job in New York by ex-boyfriend Simon, but is uncertain whether to accept the offer. She suggests to Mark they should go for a weekend break in the country, while Jeremy decides it is time to get a few things off his chest. Comedy, starring David Mitchell and Robert Webb. Last in the series.

# Doctor Who after Season 07 episode 05
15 17 25 12 * root /home/iain/bin/record 'Doctor Who s7e6 The Snowmen on BBC One for 61' # Matt Smith returns as the Time Lord, who is in mourning after losing Amy and Rory to the Weeping Angels and determined to avoid getting mixed up in the universe's problems. But a call for help whisks him back to Christmas Eve 1892, where a trio of old friends and a plucky young governess called Clara need him to take on a chilly menace that comes with the snowfall. Will he be persuaded to abandon his new life as a recluse and defend his beloved Earth once more? Jenna-Louise Coleman joins the Doctor as his new companion - although whether Clara is the same character she played in Asylum of the Daleks remains to be seen - with guest stars Richard E Grant (Withnail & I) and Tom Ward (Silent Witness).

# like Have I Got.* News For You after Season 44 episode 09
00 21 21 12 * root /home/iain/bin/record 'Have I Got News for You s44e10 Christmas Special on BBC One for 31' # Actor Daniel Radcliffe, perhaps still best known to most as boy wizard Harry Potter (despite having laid down his wand 18 months ago) is tonight's host, taking charge of Muggle team captains Ian Hislop and Paul Merton, and panellists Andy Hamilton and Sara Cox.
45 23 25 12 * root /home/iain/bin/record 'Have I Got a Bit More News for You s44e10 Have I Got a Bit More News for You Christmas Special on BBC One for 46' # Actor Daniel Radcliffe, perhaps still best known to most as boy wizard Harry Potter (despite having laid down his wand 18 months ago) is tonight's host, taking charge of Muggle team captains Ian Hislop and Paul Merton, and panellists Andy Hamilton and Sara Cox.
30 19 27 12 * root /home/iain/bin/record 'Have I Got News for You s44e11 Have I Got News for You 2012 on BBC One for 31' # A compilation of highlights from the past year, remembering how Paul Merton, Ian Hislop and a variety of guest hosts and panellists took on the big news stories of 2012. Last in the series.

# Homeland after Season 02 episode 11
00 21 23 12 * root /home/iain/bin/record 'Homeland s2e12 The Choice on Channel 4 for 91' # Feature-length conclusion of the second series of the Emmy award-winning thriller. Carrie thinks about returning to the CIA, but wonders if a career in the intelligence service is really for her, and Nicholas Brody meets with Faber and considers his family's future. Meanwhile, Saul is ordered to undertake a secret mission and Quinn makes a momentous decision. Last in the series.

# The Snowman and The Snowdog
00 20 24 12 * root /home/iain/bin/record 'The Snowman and the Snowdog on Channel 4 for 31' # Animated sequel to Raymond Briggs' classic festive tale The Snowman, telling the story of another youngster's magical Christmas. A boy's snowman and snowdog come to life at the stroke of midnight and take him on an adventure to the North Pole, where he and his new companions meet an assortment of colourful characters, including Santa himself, before returning home - where a wonderful surprise awaits.

# The Royal Institution Christmas Lectures
55 02 27 12 * root /home/iain/bin/record 'The Royal Institution Christmas Lectures s1e1 Royal Institution Christmas Lectures 2012: The Modern Alchemist: Air: The Elixir of Life on BBC Four for 61' # Peter Wothers explores the scientific elements using a periodic table made from audience members at the Royal Institution in London, to help uncover what the medieval alchemists knew about the air people breathe. In his investigation, he reveals these elements can be used to control fire, defy gravity and harness the power of a lightning storm.
00 20 27 12 * root /home/iain/bin/record 'The Royal Institution Christmas Lectures s1e2 Royal Institution Christmas Lectures 2012: The Modern Alchemist: Water: The Fountain of Youth on BBC Four for 61' # Dr Peter Wothers investigates whether drinking water can restore his youth, and discovers how exploding balloons could help solve the energy crisis. The presenter is also joined by Paralympic gold medal-winning cyclist Mark Colbourne as they try to find out what happens when two of the most reactive elements on the periodic table, caesium and fluorine, meet.
00 20 28 12 * root /home/iain/bin/record 'The Royal Institution Christmas Lectures s1e3 Royal Institution Christmas Lectures 2012: The Modern Alchemist: Earth: The Philosopher'\''s Stone on BBC Four for 61' # Dr Peter Wothers explores the elements within the earth and investigates whether it is possible to extract the world's most valuable minerals from them. He discovers how carbon dioxide can be turned into diamonds and attempts the ambitious feat of turning lead into gold. Joined by Professor Sir Harry Kroto, the pair find out what happens when you set fire to a diamond, and establish whether a member of the audience is worth their weight in gold. Last in the series.

Adding a comment line with the show name and previous episode makes it reasonably human-readable and tacking on a comment at the end of the cron line allows you to add in the episode synopsis. Works pretty well.

Smoothing the process

This is nice, but unless you have file name OCD, you won’t have a nice neat folder of TV shows. That’s why the source of some of those scripts references show_organise.php. This looks at any video file in a target folder, parses out the name, episode number and subtitle, and then moves it into the TV shows folder in a sub-folder named after that show. It then writes a symlink to that file back to a “queue” folder, so that there is one location to look in for unwatched TV shows. It then looks at the database file for XBMC, checks which of the shows in the queue folder have already been watched, and deletes the symlinks for those. The queue folder thus becomes an unattended churn of just the shows that haven’t been watched.
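For a single file, show_organise.php’s job reduces to something like this (a hypothetical shell sketch; the XBMC watched-check and tidier show-name casing are omitted):

```shell
#!/bin/sh
# Sketch: file "name.of.show.sNNeNN.subtitle.ext" away under the TV root
# and leave a symlink in the queue folder.
organise_one() {
  file="$1"; tvroot="$2"; queue="$3"
  base=$(basename "$file")
  # Everything before the sNNeNN tag is the dotted show name.
  show=$(echo "$base" | sed -E 's/\.s[0-9]+e[0-9]+.*//; s/\./ /g')
  mkdir -p "$tvroot/$show"
  mv "$file" "$tvroot/$show/$base"
  ln -s "$tvroot/$show/$base" "$queue/$base"
}
```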

No one likes TV ads

Watching recorded TV is great for being able to fast-forward over the ads, but who wants to have to put their drink down and reach for the remote to do that? I found a reasonably reliable solution for skipping UK TV ads on the mythtv wiki. This ignores most ad detection tactics (looking for a network logo in the corner and so on) and instead takes advantage of a feature of UK commercial TV: all ad breaks begin and end with a tiny block of silence. The example script takes the audio from a recording, runs it through mp3splt (designed for splitting continuous audio into separate tracks) and uses the position of those splits to determine which segments are ads and which are the actual TV show. I tweaked this script and included it at the end of my record script so that it outputs edit decision list files, which XBMC will automatically pick up and use to skip over ad breaks with no user intervention. Sweet, sweet sanity.
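The final step – turning the silence positions into an edit decision list – looks roughly like this (a sketch with an assumed threshold, not the wiki script verbatim). XBMC’s EDL format is one line per cut: start second, end second, and action 3 for a commercial break.

```shell
#!/bin/sh
# Given the silence positions (in seconds) found in the audio, mark any
# segment shorter than the threshold as part of an ad break.  The 400s
# threshold is an assumption: individual ad segments run well under that,
# while programme segments between breaks run much longer.
emit_edl() {
  prev=0
  for pos in "$@"; do
    if [ "$prev" -gt 0 ] && [ $((pos - prev)) -lt 400 ]; then
      echo "$prev $pos 3"    # start end action (3 = commercial break)
    fi
    prev=$pos
  done
}
```

Writing that output to recording.edl alongside recording.mpg is enough for XBMC to skip the breaks automatically.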

Remote schedule

This isn’t quite Sky+ though, is it? I mean, with Sky+ you get an app which lets you schedule TV shows to record from your phone. You can, however, get pretty close with a bit of Dropbox (or cloud storage service of your choice) trickery.

Step 1: Add your subscriptions file to Dropbox. I’m using the Linux CLI client.

Step 2: Symlink from that file back to where the scheduler is looking, e.g. ln -s ~/Dropbox/subscriptions.txt /share/tv/subscriptions

Step 3: Monitor that file for changes. inotify is the Swiss army knife of scripting when it comes to monitoring for file changes, but it’s less straightforward if you want to set up what is effectively a service running constantly against a file. I did, however, find a neat little script called when-changed which simplifies this massively. I then just have it set up so that whenever my subscriptions or listings file changes, it re-runs the scheduler.

This is the command which I have set up:
/home/iain/bin/when-changed /share/tv/subscriptions /share/tv/listings.xml -c "/home/iain/bin/tv_priority_schedule | tee /etc/cron.d/tv-schedule /home/iain/Dropbox/tv-schedule.copy.txt" &

This checks the subscriptions and listings files for changes and runs the TV scheduler if it finds any. The output is a crontab file which it writes to cron.d but also creates a copy in my Dropbox. Now I can edit my listings in Dropbox on my phone and see the resulting TV schedule update a short while later, also on my phone.

Of course, if I wanted I could also have it automatically compress recordings and have them dropped into my Dropbox as well for remote watching, but I think that’s for a time when I have a fatter upstream…

Remote Wireless Music Syncing Android and Linux

Sunday, February 13th, 2011

Recently, an Android version of Winamp was released, and included in its feature list was very useful wireless syncing with the Windows version of Winamp. Unfortunately, my file server at home doesn’t run Windows and Winamp under Wine is an unstable mess, so using that feature was out. Like a good Linux user, I didn’t take this lying down – we don’t sit around like Windows users waiting for someone to build the solution for us, we cobble something together using tools already available! What I decided to do was emulate the mechanism that I came up with when I built my Car PC. Basically, the core of wireless syncing in the Car PC was rsync, combined with a little bit of logic to get a list of music track locations to pull over. By offloading some of that logic onto the server, this seemed possible.

First, I installed an rsync app for Android and tested it out. Rsync backup is just a front end for rsync, which is perfect because it lets you use all the normal rsync options, allows public key authentication (so you never need to enter a password) and displays rsync’s output to a log window. I wanted to pull files from my server so in my rsync profile I selected “rsync in reverse” and added the command line options “-vHrtDL --chmod=Du+rwx,go-rwx,Fu+rw,go-rw --no-perms --progress --partial” most of which is the default for a new profile, but I added:

  • -L, to follow symlinks
  • --progress, to display file transfer progress
  • --partial, to allow file transfer resuming

Android won’t yet know that the new music files have been added, so they won’t show up in any music players (which all look up available tracks in the built-in Android media database). You need to install something like SDrescan to run after the files have copied over, which will magically make them visible in the music app of your choice.

That’s the phone side sorted out; I just need something sensible to point it at. I don’t want rsync recursing over my entire music collection wirelessly – it would take ages.

On the server, I’m running the (somewhat ageing) Firefly Media Server, which indexes and serves music for my Roku Soundbridge. I’d previously written a PHP data abstraction class for the sqlite database it runs on, so I used that to produce a script (run periodically under cron) which picks a list of albums from a playlist (created by another script) and creates a bunch of symlinks to them in the folder I set as the target in my rsync profile. The idea is that I have a regularly changing folder which one rsync command can look at to pick up new music.
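The symlink-farm part of that reduces to something like this (hypothetical function and paths; the real version picks albums from a Firefly playlist rather than at random):

```shell
#!/bin/sh
# Expose a rotating selection of album directories through one folder of
# symlinks, so a single static rsync profile always sees fresh content.
refresh_sync_folder() {
  src="$1"; dest="$2"; count="$3"
  rm -f "$dest"/*                        # drop last round's links
  ls "$src" | sort -R | head -n "$count" | while IFS= read -r album; do
    ln -s "$src/$album" "$dest/$album"   # rsync follows these via -L
  done
}
```

Because the phone’s rsync profile uses -L, the links are dereferenced and the albums arrive as real files.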

I now have wireless syncing that works anywhere with an internet connection, as well as on the local network (unlike Winamp). The idea of keeping a static rsync profile and using a source folder full of symlinks provides a cunning way of pushing content onto your phone: it would be quite easy to create a web interface to pick files on the server and have symlinks to them created in a single rsync source folder. On top of this, Rsync backup provides hooks for Tasker to link into, so you could set this up to run automatically at some opportune moment.

Proof of concept: Locating a remote machine using the Google API

Wednesday, October 27th, 2010

example output of the locate script

A couple of months ago Samy Kamkar presented a cool hack at the Black Hat conference which demonstrated that, by using a cunningly constructed URL against specific internet routers, you could inject some JavaScript into their configuration pages and trick them into sending their own MAC address (the router’s unique hardware identifier) to a script which would use it to look up the router’s position on Google’s API, thus giving the attacker a pretty good estimate of the victim’s physical location.

This was only possible because, as well as photographing everything, the Google Streetview cars have been recording the locations of every wireless access point they encountered. By recording the signal strengths of access points against the locations they were observed from, it’s possible to do a simple triangulation calculation and get a pretty good estimate of where each access point is.

I found this pretty fascinating, so I created a little PHP script to use this trick – perhaps it could help locate a stolen laptop, for example. The script works like this:

  1. Attempt to scan for nearby access points using the wireless network adapter. Save their key details (the SSID, the signal strength and, most importantly, the BSSID – the access point’s MAC address)
  2. Compile all this into POST data and send it to Google using libcurl. Even without nearby MACs, Google’s location API will do a better job of locating the machine than the usual GeoIP services.
  3. Get a bunch of data back from Google, including longitude and latitude estimates and a street address. For convenience this also gives you a Google Maps link
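Step 1 is the only fiddly bit. Assuming iwlist-style scan output (the actual PHP script may parse it differently), extracting BSSID/signal pairs looks like:

```shell
#!/bin/sh
# Pull "MAC signal" pairs out of `iwlist <iface> scan` output.  Assumes
# the usual wireless-tools format: a "Cell NN - Address:" line followed
# later by a "Signal level=-NN dBm" line.
scan_to_pairs() {
  awk '/Address:/      { mac = $NF }
       /Signal level=/ { split($0, a, "Signal level=")
                         split(a[2], b, " ")
                         print mac, b[1] }'
}
# Typical use: iwlist wlan0 scan | scan_to_pairs
```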

The accuracy depends on how many neighbours your wireless card picks up and how much data Google has harvested from the wireless networks on your road, but for most people it will be accurate almost to the right house number.

The script is on my SVN server for download

How to download your Facebook photos

Wednesday, October 27th, 2010

Facebook recently released a feature that allows you to download a static version of your Facebook profile which includes all your videos, photos, your wall and a few bits of periphery information. If you’re planning on quitting Facebook, this is great because it means you don’t lose anything and it’s also a nice offline backup of your Facebook info.

To do this, go to Account -> Account Settings and pick “Download Your Information”.

What this doesn’t get you, however, is all the photos you’re tagged in. For this you need to go through the Facebook API. Here’s how to grab all those photos in the best possible quality, in a bit of a hacky way. No idea if it violates Facebook’s terms of service – who the hell knows what they are any more?

First of all you’ll need a system which has wget, grep, sed and awk. You can get these for Windows and this should all work, but I’m going to assume you’re logged into something with a bash prompt.

  1. Find your Facebook ID. If, like most people, you have an alias for your homepage, click on your profile picture and you’ll see “id=xxxxxxxx” in the URL. That’s your Facebook ID
  2. Create a URL for calling photos.get in the API. Go to the API documentation page for photos.get and in the Test Console at the bottom, enter your Facebook ID for “subj_id”.
  3. Run the call. Click “Call Method” – you will then see a bunch of code on the right and you will have a URL at the top starting with https://api.facebook.com/method/photos.get?subj_id=. Click on it to open that page in a new window.
  4. Download the data. Either copy the file from the browser window (ctrl+a, ctrl+c, paste into a file) or run wget -O photos.json "<your URL>" to save your data in a file called photos.json
  5. Grab the images. You can now run cat photos.json | sed 's/,/\n/g' | grep src_big | grep http | sed 's/\\//g' | awk -F\" '{print $4}' | wget -i -. This will pluck out the URL of each photo you’re tagged in and download them to the current folder.
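If you’d rather not retype that pipeline, the extraction step can be wrapped up as a function (src_big is the field name in the photos.get response; the function name is my own):

```shell
#!/bin/sh
# Pull the full-size image URLs out of a photos.get JSON dump, the same
# way as the one-liner above: one value per line, strip the escaped
# slashes, then take the quoted URL field.
extract_photo_urls() {
  sed 's/,/\n/g' "$1" | grep src_big | grep http | sed 's/\\//g' | awk -F'"' '{print $4}'
}
# Then: extract_photo_urls photos.json | wget -i -
```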

You’ll then see wget going crazy downloading all those photos and when it’s done you’ll have a copy of every photo you’ve been tagged in. After that you’re free to stick them into an album, edit them into your existing offline Facebook profile – whatever you fancy.

Recipe for a decent bash shell in Android

Wednesday, August 11th, 2010
  1. Root your phone if you haven’t already and install the latest stable CyanogenMod, which includes bash. If this step sounds scary, stop reading now 🙂
  2. Install ConnectBot:
    Connectbot Market QR code
  3. ConnectBot is a fantastic SSH client, but it also has a local terminal emulator. Open it up and select “local” from the drop down list and give it any nickname.
  4. Open it up to check that the local shell works. It will, but it’s not bash and it won’t have tab completion. Hit the menu button and disconnect.
  5. Long-press the connection and select Edit host.
  6. Find the “Post-login automation” option.
  7. Enter:
    su
    bash
    export PS1="\w\$ "
    cd /

    This step automatically switches you into root mode (you will be prompted by the superuser manager the first time you do this), starts bash, sets your prompt to the working directory and then changes directory to root.

  8. Hit “OK” and then the back button to return to the connections screen. Select the connection to test if it’s all working.
  9. For quick access, you can add a shortcut to the home screen. Long press on the home screen, select “Shortcuts”, “ConnectBot” then the name of your connection.

Tada! A one-click root terminal in bash with tab-completion (courtesy of ConnectBot’s keyboard shortcuts). Much better than the Terminal app every other blog tells you to install 😉

Temporary Twitter unfollow script

Thursday, May 27th, 2010

Update: This now uses the oAuth Twitter class, not the basic auth function library.

Do you ever get people you follow on Twitter who, for some reason, have suddenly become really spammy? Normally they’re great, but maybe one evening they’ve come back drunk and are vomiting tweets all over their feed, or maybe they’re live-tweeting the opening night of some inane reality TV show or taking part in an incredibly uninteresting meme hashtag.

Whatever the reason, you don’t want to ditch them entirely, just until they’ve stopped blabbing on about stuff you don’t care about.

So I made a command line script to get around this. If you want to temporarily unfollow someone, write:

unfollow mrspammy for 1 week

or for several users:

unfollow mrspammy,mrsspammy,mrspammystennispartner for 4 days

Due to the genius of PHP’s strtotime(), the words after “for” can be pretty much any time period phrase that works with the “+” modifier from the current date. You can also specify exact dates:

unfollow mrspammy until next tuesday

or

unfollow mrspammy until 12 july

The script also looks at users who were previously unfollowed, checks whether it’s time to re-follow them and does so if required. Run it without specifying any unfollows to get just this behaviour. This means you can also put unfollow in cron (or whatever) to run periodically and it will automatically re-follow users when their time is up.
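GNU date can pull off the same trick as strtotime(), which is handy for sketching how the re-follow time gets computed (a hypothetical helper; the real script is PHP):

```shell
#!/bin/sh
# Turn the tail of an unfollow command into a re-follow timestamp.
# "for 1 week"        -> now + 1 week
# "until next tuesday" -> parsed as an absolute date
refollow_epoch() {
  mode="$1"; shift
  case "$mode" in
    for)   date -d "now + $*" +%s ;;
    until) date -d "$*" +%s ;;
  esac
}
```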

This is written for Ubuntu. Any Linux will be OK, Mac too probably if you have the paths set up properly. You might even get it working under Windows.

To get it working, you need

  1. My Twitter library under svn
  2. The unfollow script
  3. PHP installed with libcurl

Put the PHP files in the same folder. You then need to:

  1. Edit $confpath in unfollow
  2. Create twitter.conf.php, containing two define() statements with TWITTER_EMAIL and TWITTER_PASSWORD in. These should contain your username and password.
  3. chmod u+x unfollow
  4. optional: Add unfollow to crontab to run at whatever interval you want
  5. optional: make this folder part of path, symlink to unfollow or in some way make unfollow callable without defining the entire path

And you’re good to go. Easy 😉

Let me know if you end up using this or my twitter library at all.

Streaming media from Linux to a games console

Tuesday, February 24th, 2009

I’ve been using a modified original XBox to stream media from around the house since about 2003, however that same box is getting a little long in the tooth – high res videos with a lot of visual noise chug and stutter and even at the best of times the XBox isn’t doing justice to the higher resolution plasma we now have.

So what are our other options? We’ve got a Wii and two XBox 360s. If someone ends up releasing a game on the PS3 we might even get one of those as well.

Wii

This is doable but completely pointless compared to the original XBox. I’d need to transcode everything into the lower-quality Sorenson Flash video codec (whatever format it was in, this will be lower quality) and it won’t output at an HD resolution.

PS3

I’m told this works very well as a media playing client and gets along much better than the MS products with other machines on the network (as you’d expect). I don’t have one though.

XBox 360

There are basically two options for streaming to an XBox 360: Setting up Windows Media Center on your Windows-based machine or creating a UPnP (Universal Plug ‘n’ Play) based media server on the network. This is Linux (Ubuntu 8.10) we’re talking about so option one is right out. There are a few UPnP servers for Linux around though. Here’s a breakdown:

  • MediaTomb (package name, mediatomb)

    This is a very slick service which is easy to set up, automatically scans your media and presents it in an organised manner. It supports video, pictures and audio with a variety of formats and has the ability to transcode between formats on the fly using VLC. Very cool. Works with everything except a 360.
    Next!

  • uShare (package name ushare)

    This is pretty cut-down. The video playback works fine, but there is no rescan option at the time of writing, meaning you have to restart the service for new videos to appear. Not good if you want to watch something the minute it arrives on the server. This can be partly alleviated by creating a cron job to restart the server daily, but it’s not ideal. The audio library in uShare is also completely useless. It seems to attempt to organise everything, completely fails, and only lets you see a handful of tracks in amongst folders which don’t even contain audio. It would have been better if it had just let you walk through the folders as in video mode.

  • X360MediaServe (no deb package)

    This requires an old-fashioned manual setup as it’s not in any Ubuntu repository but works well. However it ONLY supports audio so is not suitable for my needs.

  • FUPPES (no deb package)

    Steer clear of this if you don’t like the idea of not only an old-fashioned manual configure, make, make install style setup, but also having to google error messages just to get it visible from an XBox. It’s almost worth just using uShare and X360MediaServe at the same time so that at least you have reasonably hassle-free video and audio.

    If you do happen to have enough patience, and you’ve managed to craft the config files into something workable, it turns out that this is actually one of the least rubbish solutions for streaming to a 360. The video works, and although you have to hammer on the vfolder.cfg file (which doesn’t get built for you) to get something workable going, it does kind of do the job in a very basic and raw way. Given a bit more time and a decent setup script this might turn out to be quite a decent server, but then given a rescan option and proper music support in uShare I wouldn’t even be trying this out.

  • TwonkyMedia Server

    Twonky sticks out in this list because although there is a native Linux server available, it’s free-trial-only before you have to part with your readies. This proprietary style of software release is so foreign to Linux that it left me reeling in confusion, and I gave up on it before even getting it working. If none of that bothers you then it’s probably the best of the entire list. I’m a man of principles though. Or a tight-arse.

  • Samba

    Inexplicably notable by its absence is Samba – SMB, straightforward, common Windows file sharing. Every OS on earth understands it and I’d been sharing media with it for years already, so why doesn’t the 360 support it? No DRM control? Is it because the protocol has been tainted by FOSS under the evil hand of the EU courts? Whatever the reason, it’s pretty typical Microsoft.

At the time of writing I have FUPPES serving both video and music, although I can’t map the 360 media player’s “Artist” and “Album” drill-down options to it. After a lot of fiddling I got most video working with a drill-down directory list. I haven’t given up on FUPPES yet, but the truth is that despite the lack of rescan and decent music support, uShare is still the better option for video streaming at the moment. Install via apt, edit a couple of lines in the config file and that’s the whole thing set up. Sure, you have to restart it to recache the media list, but it’s still less effort than FUPPES.

Conclusion

If you want to stream music, use X360MediaServe. If you want to stream video, use uShare. If you have a ton of patience, or you’re a masochist, use FUPPES. If you want to stream both video and music and you have better things to do, use MediaTomb and get a PS3 – or you could try both X360MediaServe and uShare at the same time. Bit messy though, right?

Oh hey – could this be the PS3’s first worthwhile exclusive? 😉

Update: Experiment1106 on Twitter pointed out PS3MediaServer which is apparently the best solution for streaming to PS3s (and claims basic 360 support).