Then and Now

February 11, 2010

Just to prove how busy I’ve been, this is live in our private sandbox!

Making a Logo

November 19, 2009

In the midst of a lot going on, we designed a logo for Blogcastr several weeks ago, and we think it came out great. We knew roughly what we wanted: something fun with a Web 2.0 feel. For the typeface I picked Coolvetica, a free Larabie font and a variation of the popular Helvetica, which is simply one of the greatest fonts ever designed. We certainly aren’t alone in our love of the font either: http://www.webdesignerdepot.com/2009/03/40-excellent-logos-created-with-helvetica/. Being on a tight budget, and on the recommendation of several friends, I started a contest on 99designs.com. The process was a bit of work but it was actually a lot of fun. Many of the designs were great and the designers couldn’t have been more helpful. Here’s the finished product:

Blogcastr Logo

Progress Is Good!

September 18, 2009

In my copious free time I continue to trudge forward with Blogcastr, and I’m having a blast. Since I last checked in I’ve added way too many things to list. Most notable perhaps are a dashboard, various post types (text, image, comment, repost), comments (including Facebook Connect and Twitter sign-in support!), subscriptions, and avatars. Oh, and did I mention that everything works in real time thanks to the magic of the Thrift/ejabberd backend? Oh yeah!

My Blogcast

I think it’s also worth mentioning a few great Ruby/Rails gems I’ve been working with. I moved from Authlogic to Clearance for authentication, which took about half a day but was totally worth it. The reason for the switch was that I was already using another Thoughtbot gem, Paperclip, for image uploads, and it was awesome. It seems like the Thoughtbot team puts out quality code. For Facebook Connect I’m using Facebooker, which gets the job done, and that’s about all I can say on that. For Twitter I’m using the Twitter gem by John Nunemaker, which is simple and thus something I also like. I did add a patch to it that allows unauthenticated access to the parts of the Twitter API that support it.
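
For a sense of how these gems slot in, here’s a rough sketch of a user model with Clearance and Paperclip wired together (the attachment styles, column names, and default image path are illustrative placeholders, not the actual Blogcastr code):

class User < ActiveRecord::Base
  # Clearance mixes email/password authentication into the model
  include Clearance::User

  # Paperclip handles the avatar upload and resizing;
  # the styles and default_url below are placeholder values
  has_attached_file :avatar,
    :styles      => { :thumb => "48x48>", :medium => "128x128>" },
    :default_url => "/images/missing_avatar.png"
end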

So what’s on the horizon? There are a few minor features in the pipeline, but those should be easy. Right now I’m debating investing some time in making the UI look nice versus adding live video. Yes, you heard right: LIVE VIDEO! So stay tuned.

Almost Ready

August 15, 2009

I’ll keep this short and sweet. Things have been quite busy over the last several weeks; I spent some time doing contract work for a startup here in NYC, which took me away from Blogcastr. I have finally been able to spend a large chunk of time on it over the last several days, and it’s really coming along. I’ve integrated Thrift into the stack, and now Rails and ejabberd are playing nicely. A little more work on the front end with Strophe and we’ll have ourselves a nice little beta here shortly. Simple but really cool… Almost ready. I’m excited!
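
For the curious, here is a minimal sketch of what the Rails-to-ejabberd hop might look like as a standard Thrift client call; the service and method names are hypothetical placeholders, not the actual Blogcastr interface:

require 'thrift'
# require 'blogcastr_service'  # hypothetical Thrift-generated bindings

socket    = Thrift::Socket.new('localhost', 9090)
transport = Thrift::BufferedTransport.new(socket)
protocol  = Thrift::BinaryProtocol.new(transport)

# BlogcastrService and publish_post are illustrative names only
client = BlogcastrService::Client.new(protocol)

transport.open
client.publish_post(42, "Hello from Rails!")  # ejabberd then fans the post out over XMPP
transport.close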

It’s Alive

July 8, 2009

So it’s been about two weeks since I started my Rails odyssey and things are chugging along. A couple of days ago I passed my first XMPP message to a blogcastr window and snagged a picture of it:

Blogcastr First Post

I’ve got a bunch of major pieces working together now and I think things are progressing nicely. I’ve already accomplished my goal of learning more about the current state of the real-time web, but I think there’s still more fun to have here. More posts are sure to follow.

Ruby on Rails

June 26, 2009

Rails

I’ve spent some time working on my latest project, blogcastr, which I briefly mentioned previously, and I’ve decided to do something unexpected: I’m going to learn Ruby on Rails. I had started coding a rough mock-up of blogcastr in PHP and became a little frustrated. I’m not a PHP expert by any stretch, but I know enough to get by. I had coded a site that I worked on with my brother in PHP and I actually rather enjoyed the experience (the site was relatively simple). So why am I making this decision? Well, the main motivating factor is agile development. I want to create something as fast as possible using as little code as possible. It was clear that with PHP I wasn’t coming close to that goal. I did look at several PHP frameworks (CodeIgniter being my favorite), but ultimately I decided to take two weeks and try to learn both Ruby and Rails. So far so good. I think I’ve learned the basics of each in two or three days. I just bought two books that I plan on reading to flesh out my knowledge, and that should take another week or so. I haven’t felt this excited about learning something new in a while, which feels great.

My biggest concern in choosing Rails is performance. Everywhere I look I read about performance issues and scalability problems popping up. For me this is a tough pill to swallow. I’m a C/C++ programmer, and we tend to take performance very seriously. In the world of Web 2.0 no one really seems to care, at least until the shit hits the fan. Why? Well, I think I understand the problem, and it’s essentially two things. The first is that the Web 2.0 world moves fast, so fast that projects you start today may not make sense to start a year from now. In this sort of fast-paced environment, just getting something that works is WAY more important than spending extra engineering cycles on making your site fly. For Rails in particular, satisfying changing customer needs is certainly part of this as well, but it’s really the same beast. The second issue is that of distribution. Web apps are distributed and consumed in a far different manner than, say, an OS or a conventional video game. If my C application has performance problems, every user will feel the pain regardless of how many users I have. If my web app is hurting, it’s probably because I have a lot of people using it. For some reason that doesn’t seem like a bad problem to solve. I’m sure there will be performance issues I hit because of Rails. It’s my hope that the framework is flexible enough to let me spend a little more time on the places that need it and break free from the framework when required. We’ll see if that happens.

My initial reaction to Rails is bliss. I really like it a lot. I love that I can change from SQLite to MySQL to Postgres by just changing a few lines of code. I love that the framework knows so much without any required configuration. I love the MVC design, which for the first time makes perfect sense to me (it fits way better than in the iPhone SDK). I have plenty more to learn, but I think that’s a good thing. Hopefully taking a little bit of a diversion will pay off in the long run. My bet is that it does.
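
To give a flavor of the database-swapping point: the adapter is just connection configuration. In a Rails app this lives in config/database.yml, but the equivalent Ruby call looks something like this (the database name and credentials are made up for illustration):

require 'active_record'

# Switching from SQLite or MySQL to Postgres is just a different adapter string;
# the models themselves don't change. These credentials are placeholders.
ActiveRecord::Base.establish_connection(
  :adapter  => "postgresql",   # previously "sqlite3" or "mysql"
  :database => "blogcastr_development",
  :username => "blogcastr",
  :password => "secret",
  :host     => "localhost"
)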

blogcastr

June 20, 2009

I spent today setting up bind with dlz support as previously described, and I’m happy to report it works great! I’m using the Postgres database driver and created a simple db using the bind-dlz documentation. The logging does leave a little to be desired: bind kept exiting on me, and it turned out it was simply because I hadn’t set the SOA record properly in the db. A message telling me this would have been lovely :( Anyway… I do find it really cool that I set up a dynamic DNS server in just a few hours. How I love open source! I did cut a few corners; namely, the bind-dlz documentation recommends using database replication across the two nameservers. For those that don’t know, this isn’t exactly one of Postgres’ strong points. So I’ll come back to that later in all likelihood.

I also spent a little time today learning about Apache’s mod_rewrite. The reason was that I needed some way of handling the www subdomain dilemma. I decided to send a 301 redirect for any request to a www subdomain. This is the opposite of what many sites do, though WordPress is an exception. I decided it was just more consistent behaviour, and for this particular setup the www subdomain was extraneous. My glorious mod_rewrite rule is below:
RewriteCond %{HTTP_HOST} ^www\.(.*)blogcastr\.com$ [NC]
RewriteRule ^(.*)$ http://%1blogcastr.com/$1 [R=301,L]
This just redirects any www blogcastr subdomain to that same domain without the www. If it looks confusing, welcome to mod_rewrite :)

I guess this is a good time to introduce blogcastr! This has been my little pet project for the past couple of weeks. The goal is for me to learn more about real-time web protocols and get my feet wet in the web world a little. I’ll have plenty more to say about it as it comes to life. Until then it’s very much in stealth mode.

DNS, bind and dlz

June 17, 2009

I’ve spent a little time learning about bind, which is the standard DNS server used on the net. My reason for doing so is that I’ve been working on a simple web service (that’s another post) and I want to implement a feature similar to one found in WordPress, Blogger, and Tumblr, which allows users to get their own subdomain (e.g. mrushton.wordpress.com). Now I actually don’t like how each of these services has implemented the feature: you get back valid DNS responses for subdomains that don’t exist. So I’m trying to go about things slightly differently, which requires that I use a bind extension called dlz (dynamically loaded zones). This stores zone information in a database backend, which makes everything completely dynamic. Hopefully by week’s end I can find enough time to get this setup working. More fun!

Bar Camp NYC 4

June 3, 2009

I spent all day Saturday at Bar Camp. For those that don’t know, Bar Camp is a free, community-driven tech gathering. Sessions are held throughout the day by those in attendance, and you are free to roam from session to session. It was my first Bar Camp and I didn’t have a great idea what to expect. This, coupled with the fact that I found out about it at the last minute, meant I didn’t prepare to lead a session. Looking back on it, that would have made the experience more enjoyable, and I plan to do so next time. Despite this I still had a great time. I met a bunch of interesting people and took part in several worthwhile sessions. There were several rather bland sessions (on topics I even had an interest in), but I still applaud the session leaders and really have no one to blame but myself for not taking the initiative to lead one of my own. The best, most insightful session was about Google Wave, which had just been announced 48 hours earlier. It was led by an employee at drop.io who did a terrific job and had several interesting insights.

I did have a few high-level observations about the event. Twitter is officially everywhere, which isn’t surprising but is nonetheless noteworthy. I admit to tweeting a few times from my iPhone and enjoying it. I will add that I was also monitoring the #barcampnyc4 hashtag from my iPhone while at the event. I don’t think doing this added much value to the overall experience, though it did feel cool. Perhaps that’s the point. The big topic that everyone seemed most excited about was Wave. I didn’t hear one mention of Bing, but people seemed genuinely excited about Wave. I think this bodes well for its future. I was a little underwhelmed by the number of bleeding-edge sessions (with Wave being the clear exception). While there was a session on Unix command line basics, there were none to be found about the real-time stream, for example. A little disappointing, but again, who am I to complain? Overall a great event and one I will go to again!

Git

May 20, 2009

One great tool I forgot to mention in Cool Technologies was git. Git is a decentralized SCM developed by Linus Torvalds for Linux.

My experience with SCM software has been largely positive, which I feel is definitely the exception. When I first arrived at Empirix they were using SourceSafe. It was clear everyone hated it, and a short time after I started, the switch was made to Perforce. I was really excited about this given how Linux-friendly Perforce is, especially in comparison to SourceSafe. I quickly got used to the command line interface and used it almost exclusively. I will say that the GUI is amazing, and I did use it frequently to search for changes, check diffs, and so on. At Empirix we were working relatively closely with the bleeding-edge PowerPC branch of the Linux kernel, and they were obviously using git. So I had some experience with the basics of it, but we kept the Empirix kernel under Perforce and did all our development in that environment.

When it came time to determine what SCM software to use with Movolu, I ended up deciding on git. Perforce wasn’t an option because of its licensing, which essentially left the choice between SVN and git. I had always wanted to learn more about git, and I thought the distributed model was just too cool. Distributed is absolutely perfect for open source projects, but I really thought it would work great for a small startup like Movolu as well.

In the end I had a few general thoughts after working with it. The first is that the branching and so on is quite nice. With Perforce it’s a little bit of an adventure each time, and on one occasion it even resulted in a pretty major mistake. I did think git was a little overly complicated. I think simplicity is hugely important, and, branching aside, git was hands down more complicated and more difficult to master than Perforce. I don’t think this has anything to do with the distributed model either. I also really missed having a top-notch GUI for looking at file revision history and the like. I think I would use git again for my next project, but if Perforce changed their license I think I would go with that instead (I realize it’s free up to a certain number of users, but I would just fear lock-in).

