JS Ext

Wednesday, October 31, 2012

Sharepoint wiki

I really dislike the Sharepoint built-in wiki.  I find the WYSIWYG editor annoying.  When I try to highlight a block of text, it takes a few seconds for anything to happen.  The background color of the text doesn't change, though; you just notice that the cursor position has changed.  If you hit Control-C to copy, then the background color changes.  When pasting the text, the page seems to randomly decide whether it should insert a carriage return.  If I put the cursor at the end of a line, the cursor disappears after a second and magically reappears at the beginning of the next line.  The arrow keys only work about 70% of the time.  It is very frustrating.

Tuesday, October 30, 2012

Maven's single-threaded pipeline

Maven has a single-threaded pipeline.  When you tell Maven to run the test phase, it will run the compile phase and then the test phase.  There is only one test phase, no matter how many types of tests you want to run.  I tend to write unit and integration tests.  There are also static analysis tools that take a while to run.  With a single-threaded pipeline, all of these tests run sequentially.  If any one of these steps fails, the entire "build" fails.  This is what a lot of people want: immediate feedback about problems.

As your code base gets more complicated, so do your unit and integration tests.  Then, someone decides it would be a great idea to integrate a real application server into your build process.  Now, your war file gets deployed to a Tomcat server.  You wait for the Tomcat server to start, then you run a suite of REST calls, or maybe a browser simulator.  All of this happens as part of the build process.  If a webpage fails to render, you want to know as fast as possible.  With all of your tests running sequentially, your build now takes 3 hours.  3 hours is not immediate feedback.

I'm sure the Maven people can come up with some complicated way of having multiple pom files and Jenkins jobs that kick each other off.  To me, the right answer is a multi-threaded build pipeline that supports distributed phases.  In my opinion, the first build phase should only build your code.  If the build succeeds, that code should be uploaded to a code repository.  If any testing is done during the build phase, it should be unit tests only.  There should not be any integration tests or static analysis.  If the code will not build and is not self-consistent, then it is useless.  If it does build and is self-consistent, then upload it to the code repository.  If you use Maven, this might not sit well with you.  Once you upload the artifact to the code repository, every artifact that depends on that version is now using it, because the -SNAPSHOT pointer will return the code you just uploaded.  We also established earlier that locked snapshots don't work very well.  Why would you upload untested code to your artifact repository?  At the bottom of the snapshot vs milestone article, I talk about how milestones should be pointers.

First, the newly created artifact should have a pointer name similar to -LATEST_BUILT.  This pointer will point to the latest code that has built successfully.  Once that artifact is uploaded, its unique name should be put into a bunch of queues.  There should be one queue for each type of parallel testing that you want performed.  You can have a queue for integration tests.  You can have a queue for static analysis tools.  You can have a queue for testing your application within Tomcat.  At the end of each successful run of a sub-build task, a new pointer is created/updated.  For integration tests, it's -LATEST_INT_TESTED.  For static analysis tools, it can be -LATEST_ANALYZED.  If any sub-build task fails, a pointer isn't created and someone is notified.  The build flow defines a rule for what an overall successful build requires.  You can define success as requiring every sub-build task to complete.  You can split sub-build tasks into two categories: the ones that are required for an overall success, and the ones that are optional.
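
To make the idea concrete, here is a rough sketch in Python of the queue and pointer bookkeeping described above.  All of the names (queues, pointer labels, the notify step) are made up; a real version would sit on top of your artifact repository and CI server.

# Sketch of the queue/pointer idea.  Queue names, pointer labels and the
# "notify" step are all hypothetical.
import queue

TEST_QUEUES = {
    "int": queue.Queue(),        # integration tests  -> -LATEST_INT_TESTED
    "analysis": queue.Queue(),   # static analysis    -> -LATEST_ANALYZED
    "tomcat": queue.Queue(),     # deploy-to-Tomcat smoke tests
}
REQUIRED = {"int", "tomcat"}     # sub-build tasks required for overall success

pointers = {}                    # pointer label -> unique artifact name
results = {}                     # (artifact, task) -> pass/fail

def on_successful_build(artifact):
    """Compile and unit tests passed, and the artifact was uploaded."""
    pointers["LATEST_BUILT"] = artifact
    for q in TEST_QUEUES.values():
        q.put(artifact)          # fan the artifact out to every sub-build task

def on_subtask_result(task, artifact, passed):
    results[(artifact, task)] = passed
    if passed:
        pointers["LATEST_%s_TESTED" % task.upper()] = artifact
    else:
        print("notify someone: %s failed for %s" % (task, artifact))
    finished = {t for t in TEST_QUEUES if results.get((artifact, t))}
    if REQUIRED <= finished:     # the overall success rule is satisfied
        pointers["LATEST_GOOD"] = artifact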

This setup has a few advantages over the single-threaded pipeline.  First, integration tests and static analysis tools can now run in parallel.  Second, you can fire off the next round of integration tests while you are still fixing a static analysis issue.  Forgetting to remove an unused import shouldn't cut short your rounds of integration tests.  You can rebuild and integration test your war file while fixing your imports.  Third, your build pipeline can now support multiple platforms.  If you are writing a desktop Java application, you can have daemons running on Windows, Linux and Mac that run a test suite on each platform.  If you are writing an Android application, you can have a bunch of queues that simulate each phone.  Even better, you can have a lab of phones plugged into computers.  Each time you run a build, the test suite is executed on the actual phones!

Supporting multiple build tasks that run independently of each other could open up a lot of possibilities.  A setup like this could facilitate a much larger automated test suite.  You could run more tests in less time.

Monday, October 29, 2012

Bluetooth headphones

My wife and I have been having issues with the ear buds that came with our phones while working out.  We have other ear buds, but those get annoying as well.  They work fine when walking or hiking, but they get caught when doing other activities.  We both wanted something wireless, but most of the Bluetooth headphones out there were more expensive than we wanted to pay.  Then I found the Motorola S305 Bluetooth Stereo Headset.  At $36, it was a great deal.  The headset has a good set of features.  I only wanted it for music, but it does support making and receiving phone calls.  All the controls are on the right ear piece.  The face of the ear piece has buttons for phone calls and skipping songs.  The outside of the ear piece has the power and volume buttons.  I haven't tried to make or receive any phone calls.  I was able to go to the gym without any problems.  Going on a hike or walking around the block worked out really well.  Overall, I highly recommend it.  It is a great product for a low price.


Friday, October 26, 2012

Running out of temp space

There has been some confusion about the /tmp filesystem in Solaris.  The confusion stems from the fact that the filesystem is listed as 'swap'.  In Solaris, /tmp is a tmpfs filesystem backed by virtual memory (physical memory plus the swap devices), which is why df reports it as 'swap'.  The fun starts when you run out of virtual memory.  The problem is that a lot of people think swap and virtual memory are the same thing.

When you run out of virtual memory and then try to write into /tmp, you get an error message saying the filesystem is full.  Some error messages are not very clear.  This error message is completely clear, but completely wrong.  The problem is that less technical people will insist they know the real solution.  They think we should increase the size of the /tmp filesystem.  It took a lot of arguing to convince people that we didn't need to increase the size of /tmp.  The next thing they wanted to do was buy more RAM, but that was a whole other argument for another day.

Thursday, October 25, 2012

MK802: Youtube network issues

My wife and I have started to watch a lot of Youtube shows.  While most episodes are short, others are longer.  CrashCourse episodes are over 10 minutes long.  TableTop episodes are over half an hour long.  We watch the episodes on various devices.  The episodes play fine in Google Chrome on my laptop.  My Google Nexus 7 plays the videos just fine (notice a trend with the vendor?).  The Youtube app on the MK802 is having connectivity issues, though.  The app will buffer 1 to 2 minutes of video, then it will just stop.  You won't notice it until the video hits that mark.  Then you get the spinning circle forever.  I have found that switching from HD to SD tells the Youtube app to redownload the video, starting from where the connection was lost.  Then, 1 to 2 minutes later, it happens again.  I switch back to HD and start over.  It is a pain.  This happens with both of my MK802's, so I know it isn't a problem with one particular device.

Wednesday, October 24, 2012

MK802: MX Player will only play in S/W Decoder mode

I started trying to use MX Player on a wide variety of video formats.  I started to notice that some of the videos performed pretty badly.  That is when I noticed that MX Player was in S/W mode.  It would not run in H/W mode.  I even tried H/W+ mode.  Instead of a random sampling, I decided to try out some of the videos that we tend to watch the most frequently.  Some of those performed so badly that they were not even watchable.  The Youtube app seemed to have no problem playing any video, so I know the hardware decoder can be used.  I tried other apps like VPlayer, Mobo Player, BS Player, ES Video Player and a few others.  All seemed to have the same performance characteristics as MX Player.

Tuesday, October 23, 2012

Searching for an outside security camera

I want to build a security system, but I am having trouble finding the right hardware.  I plan on using Motion to only record video when there is movement outside.  I prefer an IP-based camera over a capture card.  Capture cards limit the number of cameras that you can use.  With IP cameras, I can buy my cameras over time and add them to the system (who can afford to buy them all at once?).

Most of the home cameras that I have seen use an AC to DC adapter.  If I mount the camera under the overhang of my roof, then I need to put power somewhere.  Normally, you would run power inside of the attic, similar to an outdoor light.  If you do that with one of these cameras, then you are putting the AC to DC adapter inside the attic.  Attics are hot.  When AC to DC adapters get too hot, they tend to explode.  Explosions are bad for houses.  This means I would have to put outlets under the overhangs next to each camera.  Each of those outlets would have to be a GFI as well.  Nothing screams professional like having a big AC to DC adapter plugged in next to your camera.

Monday, October 22, 2012

ASRock boot problems

One of my hard disks started to report IO errors.  Since I manually mirror files onto different disks, I knew I wouldn't lose any data.  I started the process of adding a 3rd mirror for all the files on that disk, so that when it dies, I will still have two mirrors of all the files.  About two weeks later, I came home and the computer was unresponsive.  I tried rebooting it, and the ASRock boot screen came up, but it didn't move on from there.  I tried powering it off for a few minutes, just in case it had overheated.  It still didn't work.  Then I remembered the time I left a Wii disk in the DVD drive.  The bad hard disk must be preventing the computer from coming up.  I opened the case up.  I have 7 hard disks in this computer, so there was no way of determining which disk was the bad one just by looking at it.  I unplugged all the SATA cables and turned the computer on.  It beeped and the GRUB boot screen came up.  I turned the computer off, plugged in a single SATA cable and turned the computer back on.  I kept doing this until the GRUB screen no longer came up.  I took that disk out and it was the right one.  I guess ASRock motherboards are very sensitive to bad boot devices.

Saturday, October 20, 2012

MK802 and Bluetooth

So, it looks like the Android OS that comes with the MK802 doesn't support Bluetooth.  I started googling around and found an article that says you can install an unofficial version of Cyanogenmod 9 to get Bluetooth support.  It talks about some of the difficulties with installing CM9 and some of the problems people have reported.  Although I want Bluetooth support (I bought USB Bluetooth dongles!), I don't need it badly enough to go through that pain.

Friday, October 19, 2012

Every day is Black Friday!

I do most of my electronics shopping online.  I comparison shop, so I don't buy from a single retailer.  This has put me on multiple mailing lists to receive advertisements.  I have noticed an interesting trend.  Every day is Black Friday!  Historically, Black Friday featured the best deals.  Black Friday has been lasting longer and longer.  It lasted about two weeks this past Christmas.  To inflate their sales, the various online retailers are trying to capitalize on the visceral reaction people have to Black Friday.  I'm getting sick and tired of these sales that aren't as great as they make them seem.

Thursday, October 18, 2012

Audit vs Automation

In a previous post, I talked about the levels of automation.  In this post, I will talk about how audit rules can prevent automation.  Let's assume you belong to a software development shop.  Every time there is a new version of software being developed, someone has to create a branch in revision control.  You have to create a new pom file.  You have to create new Jenkins jobs.  You have to configure the appropriate upstream/downstream relationships.

This process seems repeatable, so someone writes a script to create the new branch.  At this point, you now have Level 2 Automation.  Whenever a new version gets created, someone has to manually execute this script.  As your company grows, this process needs to run more and more frequently.  It runs more because you are releasing more versions of software, as well as having more software packages that get released.  It's time to step up your automation to Level 3.
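
Here is roughly what such a script could look like.  It is just a sketch that assumes Subversion and the Jenkins CLI jar; every URL, job name and file path in it is made up.

# Sketch of a Level 2 branch-creation script.  It assumes Subversion and the
# Jenkins CLI jar; every URL, job name and file path below is hypothetical.
import subprocess
import sys

SVN_ROOT = "https://svn.example.com/repos/myapp"
JENKINS = "https://jenkins.example.com"

def create_release(version):
    branch_url = "%s/branches/%s" % (SVN_ROOT, version)
    # 1. Create the branch in revision control.
    subprocess.check_call(["svn", "copy", SVN_ROOT + "/trunk", branch_url,
                           "-m", "Create branch for %s" % version])
    # 2. Create a Jenkins job for the branch from a config template.
    with open("job-template.xml") as template:
        config = template.read().replace("@VERSION@", version)
    subprocess.run(["java", "-jar", "jenkins-cli.jar", "-s", JENKINS,
                    "create-job", "myapp-%s" % version],
                   input=config.encode(), check=True)

if __name__ == "__main__":
    create_release(sys.argv[1])   # run by hand today; a ticket hook later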

You integrate with your ticket system to kick off the script every time someone puts in a ticket requesting a new version.  The script has to run as an application username instead of someone's personal username.  You put in a request to create new application users for Subversion and Jenkins.  This effort will let users submit tickets and get their branches created automatically.  It seems like a good goal, but your request gets rejected.  For security reasons, only usernames associated with people can have write access to Subversion.  This is a requirement from your security team.  They want to be able to trace every single change back to a person.  That seems like a reasonable requirement, but it is a roadblock to automation.  The audit requirement prevents Level 3 Automation.

For some companies, this becomes a question of Audit vs Automation.  Will you compromise your ability to audit your revision control system to allow faster setup of your development system?  Do you try to find a middle ground?  Do you allow application usernames to create new branches, but not to modify anything that already exists?  The answer to these questions will vary from company to company.  Unfortunately, the larger the organization is, the more likely that company has strict audit requirements AND high levels of automation.  I would be interested in hearing how various companies have solved this dilemma.

Wednesday, October 17, 2012

Controlling an Android TV

Connecting an Android device to a TV is becoming more common.  Since Android is based on a touch interface, we need a way to interact with the interface.  Android does support mice, so the easiest thing to do is hook up a mouse to your Android device.  While this works, it makes it a little awkward when guests come over.  The conventional input device for anything hooked up to a TV is a remote control.  If it doesn't look like a remote control, people tend to get confused.  Though there are a growing number of people that have hooked up laptops to TV's, the general non-techie population wouldn't know what to do with a mouse.  This is ironic, since the general non-techie population knows how to use an Android device.  Below are some options for input devices:

This device seems very promising, but I haven't been able to find it in any store.  I also couldn't find any other videos about it.  I don't even have a model number.  The speaker did say the mouse couldn't be used for precise mouse movements.  Although I am very comfortable with a trackball, a lot of people don't know how to use trackballs.



This device is nice, because you can turn your TV on, then start using Android.  It has a keyboard and a mouse touchpad.  To use the mouse, you have to turn the device longways, so it feels more like a tiny keyboard with a mouse on it, instead of a remote control.  It is nice that this device comes with a touchpad because more people are used to touchpads.




The SMK-Link VP4750 is a Bluetooth mouse that seems pretty promising.  Since it's Bluetooth, it will pair with the Android TV relatively easily.  It has a touchpad that makes it easier for non-techies to use.  I have read reviews that say the device is not very ergonomic.  People have complained about it being hard to hold the trigger and use the touchpad at the same time.  This drag-and-drop gesture is very important in the Android world, since this is how you move from page to page.  The device is almost twice the price of the computer I want to control with it, however.

Ideally, what I want is a touchpad with a few buttons.  I found something that looks like what I want, but this unit is designed for car navigation systems.  If we could get a Bluetooth device that looks like this, that would be perfect!




I ended up going with the Mele Fly Sky Air Mouse.  It looks like you wave the remote control around like a WiiMote.  The remote has directional buttons with an OK button in the middle.  If you flip the remote control over, you get a full QWERTY keyboard.  It ships from China, so I don't have it yet, but I'll be sure to post when I get it.

Tuesday, October 16, 2012

Crackle: Not available on MK802

Update: Crackle works on the 3rd generation MK802.  I decided to update this page, since it was one of the most viewed pages on my blog.

I have been searching for Android apps to watch tv shows and movies.  I found the app Crackle, but it isn't available on the MK802.  It seemed promising based on the CNET Review.  I will continue searching for good Android TV apps.




Local and remote music with Ampache

My wife and I have a decent MP3 collection that we have built up over the years.  My wife wanted to listen to the music while she was working but she works from home on a regular basis.  She didn't want to create duplicate playlists: one at home and one at work.

I did a little research and found an old project that I used to use.  The project is called Ampache.  Ampache is an open source LAMP application that manages your MP3 collection.  It can create playlists that can be streamed to the music player of your choice.  It also contains a Flash player in the event that your corporate firewall won't allow media players to connect to the internet.  Ampache supports a feature called Local Play.  Local Play allows you to stream the music to a media player that is on the local LAN of your Ampache server.
Streaming to work was not an issue.  My wife could use the media player of her choice.  Ampache streams by sending a playlist file.  Every song on your playlist shows up in your media player, so you can skip or repeat a song natively.

For Local Play, I had to install MPD on my living room computer.  I was able to configure Ampache to connect to MPD and start playing music.  It took me a little while to get used to switching between the Flash player and Local Play.
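
As I understand it, Local Play just drives MPD over its plain text control protocol (port 6600 by default).  Here is a minimal sketch of talking to MPD directly; the host name and song path are made up.

# Minimal sketch of the MPD text protocol that Local Play drives for you.
# The host and song path are made up; MPD listens on port 6600 by default.
import socket

def mpd_command(sock, command):
    sock.sendall((command + "\n").encode())
    reply = b""
    while not (reply.endswith(b"OK\n") or b"ACK" in reply):
        reply += sock.recv(4096)          # read until MPD says OK (or errors)
    return reply.decode()

with socket.create_connection(("livingroom.local", 6600)) as sock:
    sock.recv(4096)                        # banner: "OK MPD <version>"
    print(mpd_command(sock, 'add "Music/some_album/track01.mp3"'))
    print(mpd_command(sock, "play"))
    print(mpd_command(sock, "status"))     # shows what is currently playing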

The browse interface has a little quirk.  The pager uses Ajax to switch between page one and page two.  If I go to page two, click an artist, then hit the back button, I end up back on page one.  It was a little annoying, so I started to open links in new tabs (I like tabs).  I end up with 5 tabs open, but adding items to the playlist in one tab will stomp on the playlists in the other tabs.  It's just something I need to get used to.

Users can create and share multiple playlists.  You can make a playlist for your various moods and listen to them in almost any location.  There are mobile apps/plugins available to allow you to stream music on the road.  If you are lucky enough to have an unlimited data plan, you can stream the playlists to your phone from anywhere.  The mobile apps don't allow you to act as a remote control for Local Play.

Ampache is a great tool for the home cloud.  The interface is pretty polished and easy to use.  It enables you to access and organize your music wherever you are.  I do wish there was a mobile app that allows me to use Local Play.  I would also like a playlist pre-sync, where I can download every song on a playlist over wifi so that I can go to the gym and not use my data plan.

Monday, October 15, 2012

MK802



I bought 2 MK802's.  They are amazing!  The device sets itself up just like a phone.  I was able to install most of the software I care about.  Here is my app matrix:

Google Market/Google Play: Works
Youtube: Works
ES File Explorer: Works
MX Player: Works
Facebook: Works
Pandora: Works
Netflix*: Works
Skype: Works
WeatherChannel: Works
FlightTrack: Not Available
Google Chrome: Not Available
WeatherBug: Not Available
iHeartRadio: Not Available

The Netflix app did not work when I installed it.  I heard about that fact before buying the MK802, but I wanted to try it anyway.  A quick Google search will tell you that you just need to install an older version of Netflix.  Specifically, you need to install version 1.8.1.  Other than that, no apps failed to run.  A few apps couldn't be installed, however.  Google Chrome said it wasn't available for the device.  The FlightTrack app that I bought for my phone (which installed on my Nexus 7) was not listed in the Play Store.  Instead, the "Pro Version" was available for $5.  That was a little frustrating since I paid $4 for the non-Pro version.  WeatherBug said it wasn't available for my device.  I had to install the WeatherChannel app to get weather information.  The iHeartRadio app was not available.

I did not try to make a video call with Skype yet.  I used ES File Explorer to browse my SMB share and play HD video over WiFi using MX Player.  I was able to play HD Youtube videos.  There were no issues when I started watching, but if I skipped forward in the video, the audio got out of sync with the video.  Pandora worked great.

I don't have Hulu Plus so I did not try it.  I have not tried many of the other apps that I use, like Dropbox, or any games.

These are great devices and are well worth the money.

Friday, October 12, 2012

Android TV

In a previous post, I talked about how the current TV developers don't write interfaces that work very well at 10ft away.  There are a growing number of mobile phone developers that can help a lot in this respect.  Those developers are accustomed to writing interfaces that are 4 inches small.  While sitting on my couch, I can hold up my phone so that it covers the TV.  At that distance, they are about the same size!  It is like a solar eclipse!

There are a few options to get Android running on a TV.  You can use an old cell phone (since you probably upgrade to a new Android every year or two).  If you have a spare computer, you can install Android x86.  A third option is to buy an Android mini-pc like the MK802, the Cotton Candy or the Raspberry Pi.

There are some disadvantages of Android on the TV.  Since no Android release has been created specifically for TV's, there are some missing features.  The biggest missing feature is input.  Android expects a touchscreen.  Most TV's are not touchscreens.  Since Android is built off of Linux, it is possible to use a mouse instead of a touchscreen.  This will work with most applications, but some gestures (like pinch to zoom) will not work.  Also, nobody wants to use a mouse for the TV.  HTPC's use remote controls.  I found a remote control with a touchpad (SMK Link VP4750), but I have not purchased it yet.  Even with a touchpad, Android apps are not easily controlled via a remote.  How would you pause a video in the YouTube app?

The Android platform already has a bunch of apps that I want for my TV.  I have paid subscriptions to Pandora and Netflix.  I can setup the Android devices in my living room and my bedroom.  I can access all my media as well as other popular apps like Skype, WeatherBug and Facebook.  With a mini-pc, all the components will fit in my pocket.  I can literally take my media center with me when I travel.

At some point, I am sure Google will create a new release specific for TV's.  Think of what Honeycomb did for tablets.  Soon, there will be an explosion of Android devices for the TV.

I have purchased an MK802.  I will let everyone know how it works out.

Thursday, October 11, 2012

Bad Piggies

I found out Bad Piggies was available for the PC, so I purchased it.  It installed fine on my Windows VM.  The game kind of reminds me of The Incredible Machine, but with an actual goal.  In The Incredible Machine, like Amazing Alex, each board has a random goal.  You are just trying to accomplish the goal of the board.  In Bad Piggies, the goal is less random.  You are trying to get the pig to the finish point.  It is kind of a cross between Angry Birds and Amazing Alex.

I have played about 20 boards at this point.  I don't know how I feel about the 3 star system yet.  In Angry Birds, you got between one and three stars depending on how many points you got (which measured how good you were at the board).  The annoying thing in Angry Birds was the game didn't tell you how many points you needed to get the different stars.  So, you had a goal, but no idea how to reach it.  In Bad Piggies, the goals are more concrete.  You have two optional tasks.  You get one star for each optional task you accomplish in a run that takes the pig across the finish line.  This solves the missing 'how' problem of Angry Birds but replaces it with a sense that you are no longer measuring skill.  Each optional task is a puzzle in its own right, while the Angry Birds point system allowed you to quantitatively measure your skill.

The boards are more puzzle-like than Angry Birds.  In Angry Birds, there were multiple ways to solve many of the boards.  In Bad Piggies, you really have to find the way the game designers intended you to solve the puzzle.  There is very little leeway.  I felt like I was trying to get inside the designers' heads more than trying to solve the puzzle.

Overall, I do enjoy the game, and I do recommend it to people that enjoyed The Incredible Machine.  I find myself playing Angry Birds more, however.  I enjoy having the ability to play around with the physics to solve the problem in Angry Birds, instead of trying to solve the problem in the exact way the designers intended in Bad Piggies.

Wednesday, October 10, 2012

Levels of automation

Data center automation is a great thing, but I feel like there is a lot of confusion about what it means for a process to be automated.  I like to split the field up into levels.  Giving each level of automation a number and a description allows us to tell a story about how to build up automation.  It gives us a nomenclature when comparing processes.

Let's start with a fictional 10 step process.  The actual steps don't matter.  What matters is that it is a multi-step process.  Let's assume that this process needs to run every day.  At first, you assign someone the task of executing all 10 steps.  This is what I like to call Level 0 Automation.  There really is no automation.  Someone manually executes the steps.

You decide that the process is taking too much time.  You are wasting one resource for two hours every day.  Add the fact that you have to cross train just in case your resource calls out sick or leaves the position.  You get someone to analyze the process and you realize that you can replace the process with a shell script.  You get someone to write the script.  Now, your resource just executes the script.  The process time is still two hours because the script requires input from time to time.  I call this Level 1 Automation.  The process is automated, but it requires input.  Because it requires input, it still requires a person to monitor the process.  That person can multitask, however.

Next, you change parts of the process and hire a more experienced programmer.  Now, your process doesn't require any input.  Since it doesn't require input, the process now takes half the time.  The resource still needs to execute the process.  This is Level 2 Automation.  Once the process gets kicked off, it is fully automated.  The process is fire and forget.

With Level 2 Automation, you still need someone to kick off the process.  There is no need to monitor the process, but you still need to find a replacement when your resource gets sick.  From here, you decide to schedule the task to run at 6am every morning.  Now, nobody has to do anything.  This is full automation.  This is Level 3 Automation.
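
To make the levels concrete, here is a made-up example of the same daily task at each level; the script and the crontab line are purely illustrative.

# Made-up example of the same daily task at the different levels.
import sys

def run_process(report_date, output_dir):
    print("running the 10 step process for %s into %s" % (report_date, output_dir))

# Level 1: the script stops and asks a person questions, so someone has to
# babysit it for the whole run.
def level_1():
    run_process(input("Which date should I process? "),
                input("Where should the output go? "))

# Level 2: everything comes in on the command line, so the run is fire and
# forget, but a person still has to kick it off.
def level_2():
    run_process(sys.argv[1], sys.argv[2])

# Level 3: nobody runs it at all; a scheduler does, e.g. a crontab entry
# like:  0 6 * * * /usr/local/bin/daily_process.sh 2012-10-10 /data/reports
if __name__ == "__main__":
    level_2()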

Usually, Level 3 Automation is the end goal for automation.  A lot of times, people don't understand the differences between the levels.  They feel that once you get to Level 1, you are automated.  In future posts, I will discuss various roadblocks on the road to full automation.

Tuesday, October 9, 2012

Open potential

There is a lot of debate going on about the closed vs open nature of the iPhone and Android platforms for the phone.  Those debates tend to be limited to mobile devices.  If you plan on only using iOS on your mobile device, there are a lot of advantages to the closed platform.  What happens when you want to use iOS on something else?  How about a TV?  You can answer with "Apple TV", but that is more of an example of why closed platforms are bad.  They limit the potential of the platform.

In a closed platform, you have to wait for the vendor to move the platform to a different device.  You can't try to install it yourself.  The vendor has complete control.  For iOS, Apple released Apple TV, but it didn't go over very well.  Although Apple TV uses iOS, it doesn't have the look and feel of the mobile version.  The mobile market isn't available on Apple TV.  Since iOS is a closed platform, you can't try to install iOS onto a mini-pc that hooks up to the TV.  You are limited to what the vendor says you are allowed to do.

For open platforms, you don't have to wait for the vendor.  There are mini pc's available that come preloaded with Android.  There are also mini pc's that you can install Android onto.  These devices have the full market on them.  If a new competitor to Hulu or Netflix comes out, and they release an Android app, then you already have support for it on your mini-pc.  You don't have to wait for an OS update.  None of this work was done by Google, however.  Since Android is an open platform, hackers have been able to modify the source code to run on the mini-pc's.  Android TV's have a long road ahead before they become really useful to the non-techie, but they are getting there.

It's this open nature that promotes the platform as a whole.  You don't have to wait for Google to make decisions.  You don't have to ask for Google's permission.  You don't have to put in a support ticket to figure out how a piece of the operating system works.  You can just try it and see what works.  From there, the vendor can learn from the hackers (and maybe even hire them).  When they try to make the official version, they can do it faster and avoid many of the mistakes the community made.  This is what a real community looks like.

Writing a closed platform like Blackberry takes a short term thinker.  You get a functional phone, but that is about it.  Writing a closed platform like iOS takes a medium term thinker.  You get a great platform that is one of the best available for phones.  That platform even extends to tablets, which are essentially large phones.  Writing an open platform like Android requires a true long term thinker, however.  You see the mobile space that iOS is trying to capture, but you also look at the potential for other devices that are not similar to mobile devices.  You look beyond the phone, into the unknown.  A true long term thinker wonders how something can be used for a completely different purpose.  Hackers are already building Android TV computers.  People already install Android onto laptops.  Imagine having Android installed onto a HUD in your car.  Imagine Android on your house thermostat.

Monday, October 8, 2012

I miss Yoshi

I have started to watch the New Super Mario Brothers Wii episodes of Hank Games, so my wife and I started playing the game again.  We picked up where we left off the last time we went on a Mario Brothers kick.  I love the New Super Mario Brothers series because it returns Mario to some of the roots that made the series great.  The one thing I miss is Yoshi, though.  Super Mario World was a great game and part of what made it great was Yoshi.  Although only a fraction of the levels had a Yoshi in them, you could bring your Yoshi with you after you completed the level.  Also, the secret level that gives you a Yoshi and a cape was awesome!  That is the one feature I dislike about New Super Mario Brothers Wii: you can't use Yoshi on most of the levels.  Now, whenever you get a Yoshi, you get really upset when you lose him, because you know it will be a while before you can get another Yoshi.


Friday, October 5, 2012

The importance of regular backups

I previously discussed my love of virtual disk snapshots.  This is a story of how snapshots can go wrong.  I recently found some free Windows 95 programs hosted on a Microsoft ftp server.  Some of the programs were demos, but some were the full versions of old school software that I wasted a lot of my youth on.  Naturally, I wanted to install some of these games.  Before installing new software into my Windows VM, I like to take a snapshot, just because I can.  Actually, it's because some software can be detrimental to the operating system, so I want a way to uninstall that software without the uninstaller leaving behind dll's that slow my VM down.

I shut down my VM, then started taking a snapshot.  All of a sudden, the qemu-img program crashed with a Segmentation Fault.  I tried booting my VM and it wouldn't boot.  It claimed the disk was invalid.  I decided to apply my previous snapshot (from a week earlier) but the VM still wouldn't boot.  Luckily, I do perform other types of backups.  Since the VM disk is just a file on the filesystem, I have been copying the entire file onto another hard disk.  Creating a snapshot backup uses almost no hard disk space, due to the copy-on-write design of the file format.  Copying an entire virtual disk uses a lot of disk space.  Therefore, I didn't use this form of backup as frequently.  My previous backup using the disk copy was two months old.  So, due to a Segmentation Fault, I lost two months of progress on Angry Birds!

Now, I have set up a weekly cron job that shuts down all my VMs, makes a complete copy of the virtual disk, then makes a snapshot.  If the snapshot fails, it copies the new backup of the virtual disk back.  It finally starts the Windows VM back up again (so my wife can always play Angry Birds whenever she wants).
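
Something along these lines would do it.  This is just a sketch; it assumes the VMs are managed with libvirt's virsh and the disks are qcow2 files, and every path, VM name and snapshot name is made up.

# Rough sketch of a weekly backup job.  It assumes the VMs are managed with
# libvirt's virsh and the disks are qcow2 files; every path, VM name and
# snapshot name below is made up.
import shutil
import subprocess
import time

VMS = {"windows": "/vms/windows.qcow2", "fileserver": "/vms/fileserver.qcow2"}
BACKUP_DIR = "/backup/vms/"

for name, disk in VMS.items():
    subprocess.call(["virsh", "shutdown", name])
    time.sleep(120)                              # crude wait for a clean shutdown
    backup = BACKUP_DIR + name + ".qcow2"
    shutil.copy2(disk, backup)                   # full copy of the virtual disk
    status = subprocess.call(["qemu-img", "snapshot", "-c",
                              time.strftime("weekly-%Y%m%d"), disk])
    if status != 0:
        shutil.copy2(backup, disk)               # snapshot failed: restore the copy
subprocess.call(["virsh", "start", "windows"])   # Angry Birds must stay available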

Thursday, October 4, 2012

The 20ft Interface

There are many programs out there that claim to have a 10ft interface.  A 10ft interface is supposed to be an interface that is usable up to 10ft away.  The problem with almost all 10ft interfaces is that they aren't usable at 10ft.  They are really only usable up to 5ft.  At 10ft, you have to squint.  The text is so small that the interface really isn't usable.  This happens because the developers of the software had to fit so much content on the screen, but it wouldn't all fit.  They ended up shrinking the text and content so that it would all fit on the screen.  This layout is still easier to use at a distance than a standard desktop pc layout, but it is still pretty hard to use.  Therefore, I define a 10ft interface as an interface that is usable up to 5ft away.  Most DVR and HTPC/Media Center PC setups fall into this category.  This includes TiVo, Comcast DVR, FiOS DVR, MythTV, XBMC and Windows Media Center Edition.

One of the reasons I created my own HTPC software was the unusability of the 10ft interface.  I decided to create what I call a 20ft interface.  A 20ft interface is an interface that is usable up to 10ft away.  20ft interfaces have some advantages and disadvantages over the 10ft interface.  First, the 20ft interface doesn't require you to squint.  All the text is readable.  Choosing a movie or a tv show is very easy because you can clearly identify every title on the screen.  Also, the movie or show descriptions are readable.  So, if you are looking for a particular episode, you can do that without getting a headache.  The problem with the large text size is the fact that large bodies of text won't fit on the screen.  It is usually easy to edit movie and show descriptions to be smaller.  What is really hard is handling movie and show titles that are really long.  They tend to get clipped off screen.  I did make changes to the software to automatically shrink the text for long titles down to a limit, but there is a lower size threshold.  This allows you to easily read most titles, but you strain a little bit on the longer titles.  Below are screenshots of my FiOS box and my HTPC software:




Another disadvantage of the 20ft interface is the lack of eye candy.  You only have so much screen real estate.  When you make text really large, most of the screen is used for text.  This doesn't leave room for album covers or movie posters.  With a 20ft interface, you mostly only see text.  Some text is highlighted, and you move up/down to navigate.  This interface style isn't as nice as XBMC, but it does have the advantage of speed.

Without the eye candy, all you are rendering is text.  Text is easy to render.  This means the CPU and memory requirements for a 20ft interface are actually less than for a 10ft interface with eye candy.  Many set top boxes are loaded with eye candy.  There are various images and icons inside of the menu system.  There are various widgets that run.  There is a lot of content.  That content contributes to the slow speed and makes the box difficult to use from 10ft away.  It is a very hard leap to write functional interfaces that work at distances of 10ft or more.  The current TV developers can't learn from me because my interface cheats to a certain degree.  My HTPC actually has two interfaces.  The 20ft TV interface is used to select what you want to watch.  That is it.  You can't do anything else.  You can't configure it.  You can't schedule anything.  It is just for selecting what you want to watch.  The second interface is an integrated website.  All configuration is done in that website.  Set top box developers have to put their entire interface on the TV, while I don't.
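
For what it is worth, the title shrinking I mentioned a few paragraphs up boils down to a loop like the one below.  In this sketch, measure_width() is a stand-in for whatever text measurement call your rendering toolkit provides, and the sizes are made up.

# Sketch of the shrink-to-fit logic for long titles.  measure_width() is a
# stand-in for a real text-measurement call; sizes and widths are made up.
def measure_width(text, point_size):
    return len(text) * point_size * 0.6          # crude width approximation

def fit_title(title, max_width, normal_size=48, min_size=32):
    size = normal_size
    while measure_width(title, size) > max_width and size > min_size:
        size -= 2                                # shrink the font a step at a time
    return size                                  # very long titles may still clip

print(fit_title("TableTop", 800))                # short title keeps the normal size
print(fit_title("A Ridiculously Long Movie Title That Never Ends", 800))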

There is a new generation of developers that are handling these problems today, they just don't know it yet.  Because of the strides they are making, and the designs that they are creating, I'm actually ditching my 20 FOOT interface for a 10 INCH interface.  That's right.  I'm building an Android TV!

Wednesday, October 3, 2012

Maven repository retirement

What is the best strategy for retiring an artifact out of a Maven repository?  You can base it on time; you only keep 2 years of milestones and the latest snapshots.  You can base it on how long it has been since an artifact was last accessed.  It is probably a good idea to keep every artifact that is currently in production, just in case you need to fix something.  You also have to figure out how many versions of a snapshot to keep.

I would like to see garbage collection as a method for artifact retirement.  Let's start by defining the roots as everything that is currently used by production, qa and development.  We can also add implicit roots for every artifact that is younger than a set period of time.

From there, we can use a standard mark-and-sweep algorithm that starts from the roots and goes through all of their dependencies.  Anything that isn't marked gets deleted.  If you use the symbolic link concept for milestones, then this answers the question of how many versions of a snapshot to keep.  Garbage collection will take care of removing all the snapshots that are too old and not in use.  Garbage collection has an added benefit of indexing the reverse dependencies.  The collector can save the path to the root when it marks an artifact.  This helps you answer the question "who is using this artifact?"
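
Here is a sketch of that mark-and-sweep pass.  The dependency map and root set are made up; a real version would read them from the repository metadata.

# Sketch of mark-and-sweep artifact retirement.  The dependency map and the
# root set are made up; a real version would come from repository metadata.
deps = {                                   # artifact -> artifacts it depends on
    "webapp-2.1": ["common-1.4", "auth-1.2"],
    "common-1.4": [],
    "auth-1.2": ["common-1.4"],
    "common-1.3": [],                      # old, nothing in production uses it
}
roots = {"webapp-2.1"}                     # everything live in prod/qa/dev, plus recent builds

def mark(roots, deps):
    marked, paths = set(), {}
    stack = [(root, [root]) for root in roots]
    while stack:
        artifact, path = stack.pop()
        if artifact in marked:
            continue
        marked.add(artifact)
        paths[artifact] = path             # remember why this artifact was kept
        for dep in deps.get(artifact, []):
            stack.append((dep, path + [dep]))
    return marked, paths

marked, paths = mark(roots, deps)
to_delete = set(deps) - marked             # sweep: anything unmarked gets retired
print(sorted(to_delete))                   # ['common-1.3']
print(paths["common-1.4"])                 # a path from a root: "who is using this?"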

Tuesday, October 2, 2012

Private clouds are a matter of scale

I came across an article about some new technology coming out of Oracle related to cloud computing.  Although I'm not a fan of Oracle's cloud technology, I don't agree that the technology doesn't qualify as being a cloud.  While reading the article, I was surprised to read that private clouds are not even real clouds!  The author refers to private clouds as a way of managing resources in a cloud-like way, but says they are not clouds.  I completely disagree with that.  Why isn't a private cloud a "true cloud"?  The author defines a cloud as "...improving use and cutting costs by giving buyers as-needed access to a portion of a much larger pool of shared resources".   To me, private clouds still fall under this definition.  Why does the larger pool of shared resources have to belong to a different company?  Why does that company have to sell those resources to another company?

Some large companies have multiple divisions.  One central IT division can create a "large pool of shared resources" and "sell" those services to the other divisions.  If one division has a sudden need for more resources, the central IT division can decrease the resources for one division and increase the resources for another.  To me, that is the key to cloud computing.  With cloud computing, the subset of resources that a "unit" is using should be elastic.  If you require a 20% increase in resources, you should be able to increase your capacity in a relatively short amount of time.  The "unit" using the resources does not have to be a different company.  It just has to be able to get resources when it needs them.  This private cloud example matches what Amazon does, but Amazon does it on a much larger scale, selling resources to other companies.  A private cloud is still a cloud, but instead of one massive company selling resources to a lot of small companies, you have one division selling resources to a few other divisions in the same company.

Given the definition that the author gives, Oracle's offering could be a "true cloud".  The author says that a company buying a fixed pool of resources makes a mockery of cloud computing.  I think ignoring the size of a company makes a mockery of how business works.

Monday, October 1, 2012

Android RSS Readers

I have been trying to find an RSS reader for Android and I wasn't having any luck.  Like most technology, each person uses it in a different way.  I want to use rss feeds so that I can be notified about updates to websites or software that I like.  I don't want these websites to email me about updates, but I do want notifications.  In another article, I will be talking about the need for an open protocol to fill this gap, but for now, I will be using rss feeds.  For this purpose, the main thing I want is for my phone and tablet (and soon my Android TV!) to post a notification when a website has made a change.  Some webpages don't update very frequently and when I'm waiting for the latest version of some piece of software, I don't want to check the site over and over again.  Other notifications include new Youtube videos being posted.  The idea (the original idea with rss) is that I don't have to check anywhere.  I just get notified of changes.  I can then read it right away, or later if I'm too busy.
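
That poll-and-notify loop is simple enough to sketch.  Here is one using the Python feedparser library; the feed URL, the poll interval and the "notification" are all placeholders.

# Sketch of the poll-and-notify idea, using the feedparser library.
# The feed URL, poll interval and the "notification" are placeholders.
import time
import feedparser

FEED_URL = "http://example.com/project/releases.rss"
seen = set()

while True:
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        key = entry.get("id", entry.link)      # fall back to the link as an id
        if key not in seen:
            seen.add(key)
            print("New item:", entry.title)    # a phone app would raise a notification
    time.sleep(30 * 60)                        # only poll every half hour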

The obvious choice was to start with Google Reader.  I started getting frustrated with it pretty fast.  I was having issues with the update interval in Google Reader.  Google refused to let me force a refresh of my rss feeds.  Unfortunately, most Android rss readers are not actually rss readers.  They are 3rd party Google Reader apps.  This means they don't actually download the rss feed.  They use a Google API to list the available news articles.  This isn't unreasonable, since Google can download the feeds while the app just asks Google "is there something new?"  This reduces mobile bandwidth, which is a good thing.  When Google Reader isn't giving me real time updates, though, this entire class of Android apps is useless to me.

The next class of rss readers are the news readers.  These apps support rss feeds, but their main goal is for you to read the news inside of the app.  They spend most of their time making the app user friendly with lots of eye candy.  If I was going to spend a lot of my time in an app, I would want it to be user friendly.  This isn't what I want in a reader, however.  These apps tend to be overkill, and you end up paying for it in battery and data in the long run.  Since these apps are intended for you to read your news articles inside of them, they pre-download all of your news articles.  This includes images and other content.  Pre-downloading allows you to view your news offline.  This is handy for the subway commuter who wants to read news on the commute to and from work, but not for a notification system.

I finally found a true rss reader: RssDemon.  It was a nice tool, but it had problems reading some of my feeds.  Those feeds would come up with zero news articles available.  I don't know if it was a parse error or what.  It just didn't support some of the feeds I wanted to monitor.  Finally, I found Sparse rss.  The name alludes to the fact that there aren't many features in this app.  It reads feeds and notifies you when there is a new item.  The widget wasn't that great.  It was hard to tell where one item stopped and the next started.  I haven't figured out how to scroll through the entries.  When adding the widget, a config screen opened up.  I don't know how to get back to that config screen after adding the widget.  As a stop gap, I can remove and then re-add the widget to access the config again.

Overall, Sparse rss is good enough for now.  It allows me to get notifications for rss items, which is the most important thing for me.