This is really neat, and possibly represents an interesting transition for desktop computing. Alexander Larsson has a first prototype of a Gtk backend that renders to HTML5. The demo video shows the Gtk test app being driven from an unmodified Firefox 4.
Phoronix recently published an article about a ~200-line Linux kernel patch that improves responsiveness under system strain. Well, Lennart Poettering, a Red Hat developer, replied to Linus Torvalds on the mailing list with an alternative that does the same thing, yet all you have to do is run 2 commands and paste 4 lines into your ~/.bashrc file. I know it sounds unbelievable, but apparently someone even ran tests showing that Lennart's solution works. Read on!
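The idea behind the trick is to put each interactive shell session into its own CPU cgroup, so the scheduler balances fairly between sessions instead of between individual tasks. A rough sketch of what that looks like (the mount point and paths here are assumptions; check your distribution's cgroup layout before using it):

```shell
# One-time setup, as root: mount the cpu cgroup hierarchy
mkdir -p /cgroup/cpu
mount -t cgroup -o cpu none /cgroup/cpu
mkdir -p /cgroup/cpu/user

# Paste into ~/.bashrc: each interactive shell (and its children)
# gets its own cpu cgroup, keyed by the shell's PID
if [ "$PS1" ]; then
    mkdir -p -m 0700 /cgroup/cpu/user/$$
    echo $$ > /cgroup/cpu/user/$$/tasks
fi
```

A kernel build with make -j64 in one terminal then contends with your desktop session as a single scheduling entity, rather than as 64 of them.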
The default fonts for gitk and git gui in Ubuntu are downright horrible. Even Ubuntu 10.04 defaults to tk8.4, which doesn't support font smoothing. Fortunately there is a simple way to fix this, and make a whole bunch of applications look prettier all at once.
$ sudo update-alternatives --config wish
There are 3 choices for the alternative wish (providing /usr/bin/wish).

  Selection    Path                   Priority   Status
------------------------------------------------------------
* 0            /usr/bin/wish-default  10000      auto mode
  1            /usr/bin/wish-default  10000      manual mode
  2            /usr/bin/wish8.4       841        manual mode
  3            /usr/bin/wish8.5       840        manual mode
Then type '3' and hit enter. Now you'll be using tk8.5 by default, and, miracle of miracles, your eyes won't be scarred by jagged, ugly fonts in gitk anymore.
Livnat Peer just posted an interesting look at converting a large source base from C# to Java. This was done because when Red Hat acquired the company that wrote KVM, they also got a huge .NET management application that they wanted to run on Linux. It's a pretty interesting look at the various approaches you could take, and how they were eventually successful.
C# on Linux is an interesting beast. Syntactically, I like C# better than Java. Properties are just too damn useful. Having to write lots of getFoo() and setFoo() methods in Java, when we've got this perfectly good '=' key on our keyboards that everyone has known about since they were 7, bugs me architecturally. It is a shortcoming that Java will probably never get past.
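To illustrate the boilerplate (the class here is just a made-up example): where C# lets you declare a one-line auto-property and assign to it with '=', the Java equivalent needs explicit accessor methods:

```java
// The getter/setter boilerplate the post complains about.
// In C#, the whole thing would be a one-line auto-property:
//     public double Celsius { get; set; }
// and callers would just write t.Celsius = 20.0;
public class Temperature {
    private double celsius;

    public double getCelsius() {
        return celsius;
    }

    public void setCelsius(double celsius) {
        this.celsius = celsius;
    }

    public static void main(String[] args) {
        Temperature t = new Temperature();
        t.setCelsius(20.0);                  // C#: t.Celsius = 20.0;
        System.out.println(t.getCelsius());  // C#: t.Celsius
    }
}
```

Two methods and a field for every exposed value, versus one declaration; multiply that by every property on every data class and the irritation adds up.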
Mono, the open source C# runtime, was the only open source just-in-time (JIT) compiler you could get your hands on a few years ago. That made it a huge boon to language implementers, and it was the de facto runtime that people would play with and hack on to build scripting engines inside other applications. It's the reason you'll see Mono specifically show up all over the place in the gaming industry. Since that time Java has gone open source, under the GPL, and LLVM, which is under a very permissive license, has really grown up. This gives developers interested in language design some options for VMs they can build on top of.
But there is always another hand. Microsoft casts a long shadow over C# on Linux. The Mono project remains many years behind Microsoft on features, and many more years behind that on stability and performance. While I was working on OpenSim, I was continuously frustrated by how much worse the environment performed on Linux than on Windows. Any project written in C# will perform relatively poorly on Linux. Relative, that is, to the same code on Microsoft .NET; it's still 20 times faster than if it were written in Python. Microsoft's saber rattling about Linux infringing their IP ensured that the Mono community remained somewhat small and close-knit, with no large organizations investing in it other than Novell.
Mono makes for some decent desktop applications. I use three of them on a regular basis: F-Spot, Tomboy, and Do. I can't function on a computer without Do any more. But I still have a personal grudge against Mono over a simple fact: I can't watch Netflix Instant on Linux. There was this theory that, because of the way the media framework worked, it was going to work "real soon". That was 3 years ago… and I'm still waiting.
C# has the basic issue that Java had for a long time: it's a vendor language. And that's just a tough thing to really believe in unless you have a sufficient reality distortion field. Java has finally transcended that. It took building a community process for future features and open sourcing the JVM. Google's entirely parallel Java implementation for Android was additional proof that it's no longer in the hands of a single vendor. And while Java remains far from perfect, if you are on Linux and want performance, it's a pretty decent approach.
Last night was our monthly MHVLUG meeting, and it also marked 7 years since our first meeting.
7 years… it’s kind of hard to imagine. I was also really touched, multiple times last night, by the waves of appreciation I got from folks in the room.
Last night was a perfect night, even though it started as anything but. At 4:30, Thor's hard drive decided it was no longer a drive. Without a backup of the presentation, we were scrambling in the office to try to recover the drive and come up with a backup plan, which meant finding another good base presentation online he could work from. I still hadn't gotten an ack back from the library, so I had to call to ensure someone would actually open the door and we wouldn't get locked out of our space. Pat called during setup and reported that they were having a mail server meltdown at work, so he would try to get there by the end with the cake, but there were no promises. This was considerably more chaos than we typically have to deal with for a meeting, and it meant the night started off off balance, with us just working to get it back on track.
As the meeting was about to start, Bruce Locke interrupted and got up and said a few words about how much he and others appreciated the efforts I’ve put into the group. He then presented me with a set of gift certificates to our local beer mecca, that a number of members had gotten together and pitched in on. I was really really touched by that. While this is a labor of love, it’s very energizing to have such a tangible gesture of how much it means to others.
The talk was great. Thor worked well off the borrowed slides, and we had a lot of great questions from the audience. Sahana is a great project, and really demonstrates how much of an impact we can have on people’s lives as members of the open source community. Sahana is actively being used right now in Haiti and Chile to handle the aftermath of their recent earthquakes. We had at least one new face in the room, who had first attended our meeting last month virtually (over the live stream), and had come out for the face to face meeting. As that was exactly what I was trying to get out of the streaming, I’m really glad it seems to be working.
I got another round of appreciation from the floor when I let folks know about the upcoming meetings we had locked and loaded for the year. It is hard to get that far ahead on the schedule, and it was great that everyone seemed really excited about our upcoming meetings.
The cake… was not a lie.
Pat showed up about 7:40 with the cake, just a few minutes after the lecture had ended. We had people hanging out and chatting until 8, and then 15 folks came out to the Palace afterwards. We had great conversation that went until 11.
As I was driving home, I thought to myself how perfect that all had really been. Good people, a great talk, and lots of good conversation. Really… just perfect.
A few years ago I bought an Oregon Scientific wireless weather station. It’s a nice way to keep an eye on what’s going on outside. The unit supports up to 3 of these remote sensors (shown here), and aggregates it at a base station. It doesn’t have a computer interface, so while it displays nicely in the living room, I can’t get access to that data.
Once upon a time I bought some 1-wire thermo sensors that I was going to wire into my computer for data collection. After the house was hit by lightning, I got far more gun-shy about running conductive cables from the outside to a computer.
It occurred to me recently that there were good odds someone had figured out how to sniff that wireless communication. These aren't complicated devices, and Oregon Scientific seems to let different generations of sensors work with different generations of head units, implying some pseudo-standard protocol. It would also solve my spark gap problem, as I could gather this data without any wires connected to the sensors.
After some googling I discovered that yes, these devices operate in the 433 MHz band, and yes, thanks to the folks at rfxcom, there is a unit out there which will receive this and spit it out over a reasonable USB interface.
Under Linux, this information can be decoded very easily with the heyu program. heyu is really more of a command line system for home automation, but it also includes the rfxcom protocol and decoders for just about everything Oregon Scientific makes.
After plugging in the rfxcom device, it took me about all of 30 minutes to get heyu compiled and configured for my devices. The first time you run the heyu monitor you’ll actually get big hex strings which are the raw data. Once you tell heyu what kinds of sensors they are, you’ll get it decoded in much more friendly units. Here are the relevant lines in the heyu config file:
TTY         dummy
TTY_AUX     /dev/ttyUSB0   RFXCOM
ALIAS       Sensor1   A1   ORE_TH1   0xB1
ALIAS       Outside   A2   ORE_TH1   0x83
ALIAS       Sensor3   A3   ORE_TH1   0xCE
ORE_TSCALE  Fahrenheit
After configured and running, I now have sensor data being collected continuously:
gallifrey:~> heyu monitor
02/25 21:49:54  Monitor started
02/25 21:50:11  rcva func oreTemp : hu A3 Ch 3 Temp 58.6F (Sensor3)
02/25 21:50:11  rcva func oreRH   : hu A3 Ch 3 RH 42% (Sensor3)
02/25 21:50:20  rcva func oreTemp : hu A1 Ch 1 Temp 43.7F LoBat (Sensor1)
02/25 21:50:20  rcva func oreRH   : hu A1 Ch 1 RH 54% LoBat (Sensor1)
02/25 21:50:22  rcva func oreTemp : hu A2 Ch 2 Temp 34.3F (Outside)
02/25 21:50:22  rcva func oreRH   : hu A2 Ch 2 RH 87% (Outside)
The next step here is gathering this into something I can use for graphing. I've now got all the sensors I was looking for to be able to build my better thermostat brain, so tying it all together and starting the graphing comes next.
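A minimal sketch of what that collection step might look like, turning heyu monitor temperature lines into (timestamp, sensor, temperature) samples. The line format is inferred from the output shown above, so treat the regex as an assumption to check against your own heyu version:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Parse heyu monitor temperature lines into graphable samples.
// Format inferred from sample output; adjust the regex if your
// heyu build prints something slightly different.
public class HeyuParser {
    // e.g. "02/25 21:50:22 rcva func oreTemp : hu A2 Ch 2 Temp 34.3F (Outside)"
    private static final Pattern TEMP = Pattern.compile(
        "(\\d+/\\d+ \\d+:\\d+:\\d+).*oreTemp.*Temp (\\d+\\.\\d+)F.*\\((\\w+)\\)");

    // Returns "timestamp sensor tempF", or null for non-temperature lines.
    public static String parse(String line) {
        Matcher m = TEMP.matcher(line);
        if (!m.find()) {
            return null;
        }
        return m.group(1) + " " + m.group(3) + " " + m.group(2);
    }

    public static void main(String[] args) {
        String line =
            "02/25 21:50:22 rcva func oreTemp : hu A2 Ch 2 Temp 34.3F (Outside)";
        System.out.println(parse(line));  // 02/25 21:50:22 Outside 34.3
    }
}
```

Pipe heyu monitor through something like this and append the samples to a file, and any graphing tool can take it from there.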
If you give presentations with PowerPoint or OpenOffice slides at any regularity, it is well worth investing in a presentation remote so you don't need to keep coming back to your computer to flip slides. It lets you walk around more naturally, without worrying about getting back to the podium/desk/table for each transition. That freedom of movement on the speaker's part makes the entire presentation feel much smoother.
Previously I had a Targus remote that I got online. It, like every other presentation remote I've seen, has a USB dongle which advertises itself as a USB keyboard. The remote triggers page up / page down, and maybe some mouse functions. This means it works on any computer and any modern operating system, with no additional software. While the Targus was sufficient, it had been slowly dying over the last couple of years. It failed on me at Ohio Linux Fest, and when, even with new batteries, it failed before my Git talk, I figured enough was enough. I scoured Amazon reviews and decided to give the Logitech R400 a shot. It arrived last night.
Holy crap, this thing is amazing. First, and most importantly, it fits perfectly in your hand. It has the same kind of ergonomics as the Tivo remote, where your hand is perfectly relaxed holding it. Its weight is enough to know it's there and solid, and whatever surface material they used just feels touchable. The buttons are in the perfect places; I realized pretty quickly that 5 minutes into my next presentation I won't even know I'm using it any more. Whoever did the ergonomic design on the R400… bravo!
The remote is pretty simple, which is good: page up / page down, F5 (which starts the presentation in OpenOffice and PowerPoint), and a screen blank function which works inside a fullscreen OpenOffice presentation, though I have no idea what key it actually sends. There is also an integrated red laser pointer of pretty reasonable power. The other notable features are that the USB dongle stows inside the remote itself, so there aren't 2 pieces to get separated and lost, and that there is an off switch. As this thing is going to live bouncing around in my backpack so I always have it with me, having an off switch to ensure accidental bounces don't hit keys and drain the battery is good. It also comes with a nice neoprene case, which makes that less of a worry.
I’m really happy about this presentation remote, and can’t wait for my next group presentation to give it a proper work out.
P.S. For another $40 you can "upgrade" to the R800, which has a green laser and a countdown timer. That's more than I need, but people love the green laser pointers.
Last Wednesday, we did a live stream of the MHVLUG meeting for my Git presentation. This was an experiment in trying new ways of getting people engaged in the group. Most people seemed to think it went quite well, though there were dissenting views about the stream quality. I think part of the dissent came from an expectation that this would be a full-on replacement for coming to the meetings, which I did not intend.
We used ustream to do the streaming, which has the advantage of working quite nicely with Linux and the Logitech 9000 webcam I’ve got (at some point I’ll do a detailed writeup on that). Ustream’s streaming app is written in Flash, which is quite clever, and means all you need is Flash 10 to start broadcasting.
My goals for this experiment were pretty simple:
- see if the tech got in the way of the experience of the people in the room. If it was too intrusive, the experiment failed.
- see if people who couldn't make it connected to the stream. At our height we had 8 people on who weren't in the room, on top of the 25 or so who were.
- see if there was a reasonable interaction pattern between people on the stream and people in the room. Pat was able to ask questions via Joe, which I answered directly. Even with the 4 seconds of audio lag I think it worked quite well. It's actually quite an interesting communication model.
- see if the audio pickup was in any way reasonable, which it was. I was really impressed by how good the audio actually was.
- see if the video was passable. This is where there was a difference of opinion. Some people wanted much higher quality here. My feeling was the quality was about what I was looking for. You got a sense of what was going on in the room, you could hear the speaker and the room well, but the slides were kind of hard to make out. The fact that my talk was diagram heavy exacerbated this.
The conclusion: streaming of meetings will probably happen from time to time from now on, based on speaker preference, as many speakers don’t want their presentation going beyond the room. The meetings are optimized for people in the room, as that live audience, and the interaction pattern there, is what people come out for, and why we are able to get good speakers (the first question I get asked when bringing in someone externally is what the audience size is).
We’ll see how it affects the group longer term, and if it exposes more people to what we are doing in the LUG. We’re nearly 7 years old now, and looking at how we use new tech to get the word out is always something to consider.
Last night we did the first of what I hope will be many MHV Android hack-a-thons. The basic idea was to get folks together who are interested in doing Android mobile development, so they'd have others around to bounce questions off of. We did it at Panera because they have food and wireless, though future sessions will probably have to move elsewhere, like Barnes & Noble, because the 9pm closing time came a bit too early.
Turnout was promising. Frank, Kershaw, and Muller all showed with their Android phones and laptops, plus we got 3 other folks who just wanted to see what an Android phone looks like. Frank and Kershaw both had the Droid, Muller has a Google-issued G1, and I've got my Hero. It was definitely interesting to see the differences across all of them, and it supports my theory that there isn't a straight road when it comes to Android base platforms. The Droid did some things the Hero didn't, and the Hero did some things the Droid didn't. A big reason for these differences is how modular Android is. You legitimately can replace any part of the core interface with your own code. HTC Sense, for instance, is a Home replacement. You can write your own. HTC also replaced the default mail, SMS, contacts, and a few other things. Some for the better (mail, contacts), some for the worse (the messaging power bug). But as a user you are empowered to replace the SMS system with a 3rd party app, which I did.
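Becoming a Home replacement, for instance, is just a matter of declaring the right intent filter on an activity in AndroidManifest.xml; the activity name here is a made-up placeholder:

```xml
<!-- Declares an activity as a Home screen replacement. -->
<!-- "MyHome" is a placeholder; any launcher-style activity works. -->
<activity android:name=".MyHome">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.HOME" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```

Once installed, pressing the Home button prompts the user to pick between this and the stock launcher, which is exactly the hook Sense uses.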
The evening started off with "oh, have you seen this yet?", which got a lot of knowledge cross-shared. Frank's starting a wiki page to try to keep track of that. I got out my laptop early and started working through the Sudoku example application in Hello Android. It's a pretty good example that covers many of the widget systems as well as the 2D graphics API. I'm pretty impressed with the book so far. Frank and Kershaw spent some time getting the SDK installed and poking at it, and Muller was focused on using the Android Scripting Environment to do some Python on the phone.
All of us except Muller are still a bit in the "ooo shiney" stage, as I've had my phone for a whole month now, and Frank and Kershaw have had theirs for less than a week. I suspect that future hack-a-thons will actually start generating a bit more code. I continue to be impressed by the API model for Android, and really look forward to working on applications for it. Yes, Java is not as nice and terse as Ruby, but at least I won't have to write widget packing code. And that makes me a happy camper.
In a week Verizon is launching 2 Android phones, the Droid and the Eris. Sprint will have the Moment by then, adding to their existing Hero. T-Mobile has the G1 and the MyTouch. Each carrier thus gives you 2 phones to choose from:
- A hard keyboard phone (G1, Droid, Moment)
- An HTC phone with the Sense UI (MyTouch, Hero, Eris)
While the Droid has gotten all the attention this past week for being the first Android 2.0 device, I'm cringing a bit at what the mass market reaction to that phone is going to be. It's being placed head to head against the iPhone, which currently defines usability in the market. From my experience on the 1.5 front, the difference between stock Android (which the G1 has) and Sense on a 1.5 phone is night and day.
HTC really went to town to provide a very smooth experience to the user. This is an interface that had a lot of user testing and human factors work put into it. That polish is what earned the Hero praise as the best gadget of 2009.
It does not, however, provide a hard keyboard (and I'm not sure there will ever be a crossover there). A lot of people are going to go to the Droid because of that hard keyboard and large screen, which is totally understandable. But for friends who have been looking at the device, I've tried to convince them to look at the Eris at the same time. I really think the Sense UI makes the phone more compelling. It adds a slickness to Android that isn't there yet in the base. If user experience is your top concern, it's worth looking at one of the HTC Sense phones when you contemplate your Android purchase.