Monday, November 12, 2007

Working with Gliffy

For those of you who haven't heard about Gliffy, let that come to an end now.

Gliffy is a free online diagram creation tool built in Adobe Flash. This means it's entirely cross-platform, which makes it easier for those multi-OS households and organizations to do development. No download is required (except possibly for Flash Player, if your copy is ridiculously out of date). Free accounts come with 3 private documents, and unlimited public documents. You can pay $30 for a one-year account, or $45 for two years. This will net you unlimited private documents, priority support, and unlimited image uploads, as well as freedom from the occasional ads.

Editing in Gliffy is really easy; you have a menu on the left with categories of basic shapes, as well as standard icons for major diagramming applications (UML being the one I use the most, but there are also icons for computer networks, user interfaces and building floor plans). Just drag, drop, and customize. Text fields are built into objects where applicable, or you can add your own. Once you've lined things up the way you like, you can group clusters of objects together with a simple keyboard command (Ctrl-G). If the right shape or icon for your application doesn't exist, you can upload an image or build a new shape out of existing ones, and group them.

Gliffy keeps track of your version history, so it's easy to go back to an older version of a diagram if you added something that didn't work. There is also excellent integration into blogs and wikis (Confluence users, take notice!), and documents can be tagged, shared and collaborated upon.

For me, this tool has been incredibly helpful in my software engineering courses. I'll probably also wind up using it at work to create simple flow charts for the new blog software we're implementing. It has to be, by far, the easiest tool for this kind of job I've run across.

POINT OF BALANCE: Like all applications, Gliffy has a few bugs. I've had a few scripts hang on me before, which either required a page refresh, or simply navigating back to the Gliffy login site, and logging in again. I would be cautious before trusting any critical data to ANY free online service, since it isn't your machine and you aren't paying anyone for the service. That said, if you just need to make a diagram, consider Gliffy.

Friday, September 28, 2007

The Death and... Death show (part 2)

As I began to detail in a previous post, several of my computers have suffered hardware failure. Since that post, even more machines have broken. Here is the list of my woes:
  1. my old laptop: despite replacing the RAM, memory problems persist. I've spoken with colleagues, and they say it might be the RAM connector ports (which I cannot fix).
  2. Larkin: Still down with (I assume) motherboard failure. I can fix this, but I need to confirm the problem and buy the part.
  3. Berman: Not dead by any stretch, but the built-in iSight has given out for no apparent reason. Under warranty, but I can't afford to lose the use of this system right now.
  4. the SquareOne: No longer lets computers on the local network connect to the Internet. Tech support says this, too, is hardware failure. It's under warranty, and they can either replace it now, or upgrade it to the Generation 2 in a few weeks (at a discount). I'm thinking I'll go for the Gen 2... I could use the WiFi and extra processing power.
  5. My work computer: In an attempt to add more RAM and a second hard drive, my work system has lost almost all its usefulness; no more Office, or Creative Suite, or any other program I installed. I've seen more computer guts lately than I care to recall. I can't even put a Kubuntu install on the thing... good thing most of my work can be done with online or open-source programs.
  6. Pearl: My Windows system is still holding on strong, but I think the Javascript engine is corrupted. I can't get various webpages that use .js to display properly (regardless of which browser I use). Not a well-documented problem.
I had even lost the use of the modem for a time (Comcast turned off our Internet by mistake; took 3 days to fix). This led me to believe perhaps this apartment had some kind of static field or techno-poltergeist or something. But with the repair of the modem and the problems at work, I think it might just be me.

More as it develops.

Friday, August 31, 2007

The Death and Resurrection Show - part 1

As you may have gathered if you follow this blog, my laptop died a while ago. This prompted the introduction of Berman into the house, and my recanting of all things anti-Mac. The particulars of this Dell Inspiron 8100's death involved nastier and nastier blue-screens until finally nothing would start up. Keep in mind this is a 6-year-old computer, and has had its hard drive, DVD drive and monitor replaced in its lifetime. It seemed its time had come.

Further investigation into the BIOS showed me that the system memory was off (200 MB. Not a power of 2. Problem.). It seemed, then, that the RAM was the source of the problem. Now, RAM can be removed, replaced and upgraded, so once I downloaded the manual for the Inspiron 8100, I was able to get in there, and figure out what chips I needed.

eBay is my friend. I found two 256MB PC133 144-pin SODIMMs for a pretty reasonable price (~$50), and within a few days, they arrived. In the meantime, I'd pulled the damaged RAM chip, which proved that that was indeed the problem (booted up fine after, but ridiculously slow; that's 128 MB for ya). With all the anti-static precautions I could muster, I put in the new chips, doubling what the RAM used to be, and bringing my old laptop back to life, for at least a little while longer.

Once I make sure I have all the files on it backed up on the server, I'm considering installing Linux on it (now that I know it's so easy). Unfortunately, even with doubled RAM, it's still not powerful enough to run the Kubuntu LiveCD at any degree of speed. I was able to get it up and running, proving that at least the major drivers worked, but it was very tedious, and I didn't have the time or inclination to do much further testing. More planned for part 1.5.

What I haven't had the time to mention until now is the untimely death of Larkin, the system on which I just installed Kubuntu. I was using it that morning, turned it off, then when I tried to power-on, nothing. Well, the fans started up, but there was no POST (Power-On Self Test, those first beeps before the screen does anything), no BIOS, no signal of any kind to the monitor. Inspection of the innards of the thing showed nothing overtly wrong, like a disconnected cable or charred patch. Talking with the system's builder, I was able to determine that it is likely a problem with the motherboard. Fortunately, this is an easy fix, and comparatively cheap. I just need the time to confirm, and then do the disassembly and rebuild. This is planned as part 2.0 of the Death and Resurrection Show. Stay tuned!

Special thanks to the Killing Joke for the title.

Obligatory Blog Day post

The massive interwoven mesh of the blogosphere tells me it's time for all good little bloggers to jot down their 5 recommended blogs for the year. This is my Blog Day post, so here are my favourites.

  1. Lifehacker - Just got turned on to this one, but it's absolutely awesome. From Firefox Extensions and Ubuntu hacks to sleep-enrichment tips and guides to multitasking, this blog points me to at least one new tip, trick or tool a day.
  2. lo-fi librarian - a UK-based law librarian who posts a list of new tools each week. Like Lifehacker, this blog dramatically increases my bag o'tricks. lo-fi has also started experimenting with Ubuntu Linux, so it's interesting to compare notes.
  3. Catalogablog - Keeps a finger on the pulse of the cataloging world, which I think has enough wrong with it currently, and enough going for it in the future, that I want to stay knowledgeable. The new info on MODS, METS, RDF and FRBR can get a little dry in its technicalities, but tracking the headlines helps keep me current.
  4. Stephen's Lighthouse - I'm a Stephen Abram fan-boy, I'll admit it. I love seeing the man speak at SLA, and reading his column in Information Outlook is my favourite part of that publication. His blog is well written and thought-provoking.
  5. Slashdot - News for Nerds, Stuff that Matters. Almost too technical for me, sometimes, but often full of useful news (more specific to my world than Boing Boing, which I also like to follow).
I'm sure I'll have more to share next Aug. 31, but for now, that's it.

Wednesday, August 15, 2007

Kubuntu installed!

And my, how incredibly easy it was.

As you might remember from earlier posts, I decided that it would be a good idea to put a more recent and user-friendly version of Linux on Sarah's computer, Larkin. It had been running Debian, which while powerful, was more complex than we needed. Also, it had not been updated any time recently, and doing so might have led to a video driver recompile (two words: Ewww wwww).

Research and recommendations from the computer's creator led me to seek out Ubuntu. It's based on Debian, but makes more things automatic, so one doesn't have to know the proper Unix commands to do as many things. Its intent is to be simple, easy to install and use, and still uber-powerful. It comes in several flavours (regular, Xubuntu, Edubuntu and our choice, Kubuntu), and has a regular release cycle of 6 months. The current version, 7.04 Feisty Fawn, is 4 months old (the number before the decimal is the year, the number after is the month). Dapper Drake, version 6.06, is available for long-term support (3 years). I downloaded the disc images of both. Dapper Drake was tested out first on Berman, but since it's almost time for 7.10 Gutsy Gibbon to come out, I figured Feisty was safe enough.

I burned the .iso files onto disc using the Burn Image option on Infra Recorder (this is in Windows; Mac has a utility for this kind of burn built in), then put the Feisty Fawn disc into Larkin. I had previously configured the BIOS to boot from disc before trying to boot from the hard drive, so Kubuntu started right up. The LiveCD I'd created allowed me to run Kubuntu from the disc first, to make sure it was compatible with my hardware. It was. All I had to do was click the install icon on the desktop, answer some simple locational questions, pick a username and password, and define my partition (since the whole computer was to be on this OS, I could just let the machine do it for me. Unlike Windows, Kubuntu does let users install it as a secondary operating system). That was that. I waited half an hour, then restarted, removed the LiveCD, and there I was, with a new Linux distro.

Configuring was easy. I simply needed to know what packages to install, then use Adept to do it. I added various bits of software for development, as well as some codecs, and the GIMP. More can be added at any time. Adept also looks for software interdependencies, and will get all the necessary programs to support what you requested.

Firefox came pre-installed, and I did the usual extensions. Setting up the printer and networked storage were just a matter of pointing the system to the right place.

The only challenges I had were getting the audio to work (which required mucking around with muted values in the Mixer, and plugging the speakers into the right port) and getting Java to download (I missed the 'do you agree' checkbox the first time).

So, Linux is easy. Anyone who wants to test it is welcome to borrow (or have) my LiveCDs for either Dapper or Feisty. I'm so pleased to have pulled this off without turning Larkin into a dead pile of metal (I wasn't worried, but I certainly considered what I'd do if so). I'm also glad that I'm personally free of Microsoft. When it comes time to replace my next computer, I won't be forced to buy Vista and all its evilness; Berman has shown me the way of the Mac, and Larkin can now be my guide to the world of Linux. I suppose this makes me a computer geek.


Friday, August 03, 2007

Welcome Berman, the new Mac in the household

Thanks to kind funding from my parents, we have just added a new laptop to the household network. Berman is named for the radical cataloger who challenged the racist, sexist, Christocentric and just plain arcane biases of LCSH. (We had thought of going with Avram, which sounds a little cooler, but Sanford Berman is a bit more personally significant to the both of us).

Berman has been installed with:
  • Adobe Creative Suite 3
  • Parallels 3.0
  • MS Office 2004
  • Firefox (with Zotero, FireFTP, Twitbin, and Foxmarks extensions)
  • Adium (basically Pidgin for Mac)
  • Second Life
It's already talking quite plainly with our server, and the DVD player is clean and easy (the remote control really helps, too). All the installations, configurations and customizations have been incredibly easy. I'm still figuring out the Mac print interface, and making sure I've configured Berman to talk to the shared printer properly.

Even installing Kubuntu 6.06 (Dapper Drake) on Parallels was ridiculously simple. A twenty minute download, and a pointer, and I was set. I'm still working out how to save everything, but that shouldn't be a problem. I will read the book of a manual that comes with it when I have more time.

Still have yet to test the wireless connection, or to point iTunes to the music uploaded to the server. I will probably install more programs as I need them, as well.

Very, very, very pleased. I take back most everything ill I may have said about Macs.

Friday, July 20, 2007

Reviewing Wink

I just used the screen-capturing software Wink to create a walkthrough of my LSC 597 Digital Library project. Since I was unable to attend the final class in person (wedding in Florida), I thought I could try out this tool and see how well it substitutes for an in-person demonstration. The price couldn't be better (free), and the export format is the nearly universal .swf (Flash).

The interface is very straight forward, with a main pane for the current frame, a set of thumbnails of all frames under that, and the local resources for each frame on the side.

You typically start by using the screen-capture wizard to grab still shots of either your whole screen, a window, or a shape. You can also set it up to record automatically, at whatever frame-rate you like, or to take a snapshot at every mouse or keyboard event. You can record audio as you go, or do it later when you're working with the frames you've taken.

One feature I didn't notice immediately is that Wink captures the frame and the cursor location separately. This means that you can move the cursor around (and change its type) to wherever you want on the frame. The cursor will move automatically as you change frames, going from the previous frame's location to the current frame's. I found that I could use this feature to save from recording continuously in order to capture mouse movement; I just needed one shot of the frame, and then I could move the cursor from link to link.

Adding audio is easy; just click the 'add audio' button on the local resources, and record. You can do it as many times as necessary to get it right. The exported presentation will automatically remain on the audio-enhanced frame as long as it takes to play the whole recording; you don't have to manually add 'stay on this frame for' (if you do, it will wait that much longer after the recording finishes...).

You can also add buttons, links, text boxes, shapes, and internal jump-to points. I didn't find any of these useful for my presentation (though I did use a text box to white out something), but they are good features to note.

The export process for my several-minutes walkthrough took about a minute and a half. The resulting file was about 5 MB. Wink automatically produces an HTML file with the .swf embedded, so you're ready to load it onto the web.

All in all, I was very pleased with Wink. If I need to do more online tutorials, I think the extra experience using it will make me like it that much more. I would need to invest in a better microphone and recording space, though. I look forward to a chance to compare it to the commercial software Captivate.

Oh, you can view my website tour here. Keep in mind, the intended audience is my LSC 597 class.

Thursday, July 19, 2007

Website Redesigned!

At long last, I've had the opportunity to redesign my website. The chance came with my enrollment in LSC 597: Digital Libraries. The primary feature of this new site, aside from its new address, is that it is a database-driven dynamic site, rather than flat HTML. I'm hosting it myself, on the SquareOne. You are encouraged to check it out.

The Photos section is currently up and running, though you have to be a local (i.e. in my apartment) user to see the full sized images. This will change when I build the user login system.

Keep in mind that advanced search and Documents searching are as yet unimplemented. The collection is small enough right now that that's not too big a deal. When the collection grows, so too will advanced searching.

There is a lot of other stuff on the list to do; I know there are several broken links, and the CSS I'm using is horribly boring. One of the nice features that will come with the user login is the ability to choose amongst several CSS styles, and have the system remember your choice (without those pesky cookies!). Again, coming soon.

I welcome comments and feedback.

Thursday, July 05, 2007

Digital Libraries presentation using Spresent

For my LSC 597: Digital Libraries class, I am required to do a presentation on digital library technology. Rather than use Powerpoint, or cobble together a website, I thought I'd try out Spresent, a tool I discovered via Librarian in Black. It's Flash based, which has intrigued me since SLA 2006.

The first thing I noticed was that, like most Web 2.0 tools, Spresent is still under development. They have lots of plans, including being able to export the presentation as a .swf, and being able to upload photos from your harddrive, but at this time, neither works.

Text can be either a headline, a text block, or a list. Each of those types comes with possible animations and other options.

There are several good backgrounds to choose from, and a store of generic clipart. The real photographic power comes from the Browse Flickr button, where you can search for tags (with respect paid to copyright and Creative Commons). You can also specify the location of a JPG, YouTube video or Flash animation to embed in the presentation. Downside: only JPGs work, not GIFs or PNGs.

Slides are easy to create, copy and modify, and you can build your slide show linearly or non-linearly, using buttons to jump from slide to slide. Internal links update automatically when you insert or remove a slide.

It wouldn't be Flash if it didn't play like a movie, and you can set your presentation to run automatically, with audio in the background and everything. I chose not to use this for my presentation, since I didn't know how long I'd spend on each slide topic, but for stand-alone web presentations, this is good stuff.

One thing I didn't like: you can't make text or image hyperlinks. I wanted to use the icons from the various companies I research as links in my bibliography, but (in addition to just looking cluttered), I couldn't make an image into a link. Nor could I put a link into text, or really do any kind of formatting to the text, actually (no bold, no italics, nothing).

All in all, I think I found a good alternative to Powerpoint, though somewhat limited in what it can do right now. Hopefully as more people give Spresent a try, and are impressed, they will continue to upgrade and enrich the application. Give it a go if you have a presentation you need to give, and have a little extra time to play around with a new tool.

By the way, if you'd like to see what I created for Digital Libraries, you can go to the following link:

Friday, June 22, 2007

Mastering PHP's XML Parser

Probably not the most exciting subject for anyone else, but I've spent a lot of hours in the last couple days writing a script to automatically read ATOM feeds into a webpage. And yes, I'm aware that such things already exist, and can be found easily on Sourceforge, but doing it myself helped get me that much deeper into working with PHP, and made sure that I understood both the ATOM schema and PHP's XML Parser.

The XML Parser requires you to register several handler functions in order to do whatever it is you want it to do. You need a function to deal with start tags, one for end tags, and one for character data between those tags. The only place you can deal with attributes is within the start-tag function. It helps, then, to have some global variables to store the information you want to pluck out of an XML document. This sounded like something suited for an object-oriented approach, so I started by building a class called ATOMParser. I made the start, end and character functions internal, and created two public functions, one to parse a document, the other to set how many results to return and which label to search for.
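PHP's XML Parser is a thin wrapper around the expat library, and Python's standard library exposes the very same handler model, so here's a rough sketch of the approach in Python. The class and tag names here are my own illustrations, not the actual ATOMParser code:

```python
# Sketch of the handler-based approach, using Python's expat bindings
# (which mirror PHP's xml_parser API). Names are illustrative only.
import xml.parsers.expat

class FeedParser:
    def __init__(self, wanted_tag="title", limit=3):
        self.wanted_tag = wanted_tag   # which element's text to collect
        self.limit = limit             # how many results to return
        self.results = []
        self._in_wanted = False
        self._buffer = []

    # Handler for start tags; the only place attributes are visible.
    def _start(self, name, attrs):
        if name == self.wanted_tag and len(self.results) < self.limit:
            self._in_wanted = True
            self._buffer = []

    # Handler for end tags; flush the collected character data.
    def _end(self, name):
        if name == self.wanted_tag and self._in_wanted:
            self.results.append("".join(self._buffer))
            self._in_wanted = False

    # Handler for character data between tags; may arrive in pieces.
    def _chars(self, data):
        if self._in_wanted:
            self._buffer.append(data)

    # The one public entry point: parse a document, return the results.
    def parse(self, document):
        p = xml.parsers.expat.ParserCreate()
        p.StartElementHandler = self._start
        p.EndElementHandler = self._end
        p.CharacterDataHandler = self._chars
        p.Parse(document, True)
        return self.results

atom = """<feed><entry><title>First post</title></entry>
<entry><title>Second post</title></entry></feed>"""
print(FeedParser("title", 3).parse(atom))  # ['First post', 'Second post']
```

The internal handlers keep state on the object rather than in true globals, which is exactly the appeal of wrapping the parser in a class.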

Here entered a wrinkle; on my server, where I did much of my testing, I run PHP 4.something, whereas the university server has PHP 5.something. PHP 5 has a whole bunch of reserved words like public, private and interface, whereas PHP 4 does not. To make things work on both ends, I had to go generic with my terms. It all worked fine, but theoretically, someone could call the start or end functions from outside the ATOMParser object... for what little good it would do.

Another difference: on my server, I could simply open the XML straight away, whereas on the URI server it was necessary to stream the document in chunks, then open it. What does this mean? Well, using code that worked on my server to display all the different posts in the blog only showed the latest post (maybe the latest two, if I refreshed just right) on the URI server. The chunking fixed that.
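Chunked streaming comes down to feeding the parser pieces of the document rather than one big string; expat supports this natively with an "is final" flag on each parse call. Again a Python sketch of the idea (the chunk size and document are made up for illustration):

```python
# Sketch of incremental (chunked) parsing: feed the document to the
# parser a piece at a time instead of all at once.
import xml.parsers.expat

doc = "<feed>" + "<entry><title>post</title></entry>" * 5 + "</feed>"

count = 0
def start(name, attrs):
    global count
    if name == "entry":
        count += 1

p = xml.parsers.expat.ParserCreate()
p.StartElementHandler = start

CHUNK = 16  # arbitrary; real code would read chunks off a stream
for i in range(0, len(doc), CHUNK):
    p.Parse(doc[i:i + CHUNK], False)   # isfinal=False: more data coming
p.Parse("", True)                      # isfinal=True: flush the parser

print(count)  # all 5 entries seen, even though chunks split tags mid-name
```

The parser happily handles tags that straddle chunk boundaries, which is what makes this safe for documents arriving over the network.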

So, my end result is a pretty portable piece of PHP that I can use elsewhere on the GSLIS site. You can see its current implementation at More will follow, notably the joblist. I will also likely use this somewhere in my website for bringing in blog info (same blog package as GSLIS). For other XML documents, I can now create custom parsers to do my bidding. Very exciting.

Well, to me, at least.

Thursday, June 07, 2007

SLA 2007 - Wrapup

SLA 2007 has come and gone, and I am certainly the better for it. I just wanted to take a moment and review my total experience here in Denver.

Technical skills:
I am now more than proficient with wikis and podcasts. Not only do I know a fair amount about setting them up, but I also have ideas on what kind of content to use with them. I'm more confident with my abilities, both at the IT level, and the CS level. Once again, everyone I talk to about my dual degrees says I'm going to have my pick of jobs.

I've picked up some new hints on great tools to use in the next year. Google Custom Search Engine will get used as soon as I'm back at work for the GSLIS. Until I can get MediaWiki installed on my server, Sarah and I might use PBwiki for planning our wedding. The screen-to-flash tutorial creator Wink I think might already be handy for a colleague, and I hope for me soon, as well. Just talking to folks, I got turned on to a Firefox extension that lets me do browser screen captures quickly and easily. And, I can't forget all the wonderful Web 2.0 search tools presented by Mary Ellen Bates.

I've always cared about the environment, but Al Gore's speech made me that much more energized to really do something about it. The free pedometers from Thompson and the Social Science Division added that much more encouragement to use my own leg power to get around.

I'm also more caught up on the latest issues surrounding copyright and collaboration. I have a better idea what I need to do when I do any kind of digitization. I'm also more educated on Creative Commons, and how to utilize it for better information dissemination.

I have met so many wonderful people here, I doubt I could list them all even if I had my stack of received business cards in front of me. I can, however, sort them into several large groups, and thank them that way for their contribution to my conference experience.
  • Rhode Island chapter (namely Bill Anger and Jane Loescher)
  • Boston chapter (particularly Dave Ware, who has been phenomenal to me this whole conference)
  • Kentucky chapter (Leoma, James, Stacey, Alex, Valerie, Liz, and all the rest. They were incredibly nice to me, and let me crash their dinner and drinks time last night)
  • IT Division (which has quite the overlap with the above mentioned chapters)
  • The SLA Bloggers (you're all going into my reader!)
  • The vendors and instructors from all the myriad companies who sponsored SLA 2007
  • All the wonderful folks I've met in classes, at parties and in passing.
Please don't feel left out if I didn't mention you specifically, since I'm kind of pressed for time here in this coffee shop, and don't have all my info in front of me.

Division Service:
Hanging out with the IT division has led me to fall into the position of possibly being Chair Elect of something this coming year. Since this is one of the things I mentioned specifically in my stipend-winning essay that I'd like to do, I feel that much more successful about my journey out here.

What I'll do differently in Seattle:
  • Bring business cards. I should actually have an official affiliation by then, so this should be easy enough to do.
  • Not be moving the same week as the conference. This is stressful for everyone.
  • Bring a PDA or smaller laptop. This old Inspiron is a shoulder-killer.
  • Sign up as an SLA Blogger. Official access to the press room would be really nice.
  • Plan more relax time. I ran myself pretty ragged this conference, and while I got a lot out of doing it, I think I'd prefer to take some more evenings off (at least partially) next time.
  • Attend with Sarah! I'm really hoping she'll decide SLA is right for her, and that she'll be able to get away for the conference dates. Her presence would make the conference truly perfect.

Well, time to wrap this up, and start moving towards the airport.

Wednesday, June 06, 2007

SLA 2007 - Day 4 (Wednesday)

We start off with our closing speaker, Scott Adams, creator of Dilbert. Very good presenter, intelligent, humourous (very well timed), and all in all just plain delightful. Other SLA members have reviewed Scott's presentation. Click here or here for more in-depth opinions.

New Technologies in Instruction and Training Poster Session
I stopped by this briefly, and picked up several really good tips, tools and ideas. Below are what I found to be the gems.
  • Drexel University Libraries have created a "subway map" of the many avenues of getting users content. The "stations" on the different "lines" show how complex the information delivery world is. They have even created "schedules" for various "routes" (LC schedules). It's all quite innovative.
  • A tool called Wink allows you to do screen captures, or import from image, then add audio, explanation boxes, buttons, etc. and export to Macromedia Flash (.swf). Sounds like a wonderful way to create cross-platform tutorials for various programs your library or institution might be using.
  • Tim Tierney at URI was talking about these pens that can digitally read what you write on paper. Well, I ran across that tool in this session. It's called Tegrity.
  • I got a name of someone at Yale libraries that's doing podcasting. Since I'm moving a few blocks away from Yale, and my upstairs neighbour-to-be is a librarian there, I figured this was a good connection to try to make.

Podcasting the Librarian Way
by the IT division, with speakers Tammy Allgood and Debbie MacLeod
This course provided an introduction to some of the possibilities of what podcasting can do for a library or information center. Tammy Allgood, of Arizona State University, showed us what she was able to do at her library for less than $300 of equipment. Debbie MacLeod, of the Colorado Talking Book Library, presented some of what her institution was doing to bring literary content to the textually impaired. Below is a list of podcasts that librarians have put together.

SLA Tech Zone: Podcasting – Make Noise the New Fashioned Way
presented by Thomas Dopko of Dow Jones
Another wonderful presentation by Tom. This course covered the technical requirements for getting a podcast up and running. That is, what do you have to do to be a podcaster. I won't go too far in depth, since you can view the slides of the presentation until Dec 31. I will, however, summarize how incredibly easy it is to podcast. You need the following:
  1. Content - find something you want to say, either in audio or video form
  2. Record that content in a standard form - .mp3 for audio, .mov for video
  3. Host the content - put it on a server
  4. Get or make an RSS feed for that content - you can use a free service like Switchpod, or write your own XML feed
  5. Share your feed address with people - let them know where they can find your content
  6. As a user, have a feed reader for podcasts (iTunes works well, so does Juice), and put that address into it.
  7. YOU'RE DONE!!!!
I'm looking forward to creating and hosting some of my own podcasts soon. I already have the tools, the software, and server to get everything up and running. I would just need to upload the videos and mp3s to the Apache part of the server, write an RSS feed document, and then point people to it all. Couldn't be easier.
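The feed document itself is just a small piece of XML. Here's a sketch of a minimal RSS 2.0 podcast feed built with Python's standard library; every title, URL, and file name below is a placeholder for illustration, not a real feed:

```python
# Minimal RSS 2.0 podcast feed built with the standard library.
# All titles, URLs, and file names are made-up placeholders.
from xml.etree import ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Library Podcast"
ET.SubElement(channel, "link").text = "http://example.org/podcast/"
ET.SubElement(channel, "description").text = "Demo feed"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "Episode 1"
# The enclosure element is what makes an RSS item a podcast episode:
# it points the feed reader at the actual media file.
ET.SubElement(item, "enclosure",
              url="http://example.org/podcast/ep1.mp3",
              length="1234567",          # file size in bytes
              type="audio/mpeg")

feed_xml = ET.tostring(rss, encoding="unicode")
print(feed_xml)
```

Drop the resulting document on the server next to the media files, hand out its address, and steps 4 and 5 above are done.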

Tuesday, June 05, 2007

SLA 2007 - Day 3 (Tuesday)

More courses today. Fortunately, several of them appear on other blogs, so I can save some typing time.

SLA Hot Topic - Collaboration vs. Copy Protection
A panel discussion with prominent folks from several areas in the copyright battle. This session looked to present all the possible viewpoints for what can be done with copy protection in the 21st century.

First was Stephen Abram, who brought up the inconsistency of treating creative works the same as scholarly works. Do authors really make continued discoveries 70 years after their deaths?

Next, moderator Victor Camlek talked about the need for continued copy protection in order not to disrupt the current business model. Users and publishers can sometimes come into contention, and something needs to be worked out to keep information flowing, while still "following the money".

Bill Burger from Copyright Clearance Center talked about the difference between the desire for financial protection, and the need for recognition. Different kinds of writers have different goals, and CCC has several products out there now that help users figure out exactly what permissions they have with a specific piece of material. The goal of CCC is to make it easy to do the right thing.

Crystal Megaridis of Praxair Inc talked about her company, and its need for both correct and complete content. These can be at odds, since one can't necessarily trust things that come from outside one's institution. Yet, that is where most information exists.

Finally, Thinh Nguyen of Science Commons spoke of a parallel world where the "amber of copyright was softened", and science was able to lead to cures and discoveries many years sooner. The cost we face in living in the world we do is invisible to us, since we cannot see what might have happened. By using the semantic web, he hopes we can loose the facts from publications, distributing them as quickly as technology allows, while still providing protection for the creative part of the work.

In the Q & A, someone asked if Fair Use was dead. No, definitely not. Google, for one, is banking on it with its digitization efforts. Nguyen quipped that Fair Use was a 'license to hire a lawyer'. It's a primarily American concept, Abram points out, and that's why the music genome service Pandora is only available in the US.

Is there hope? Of course. By working with publishers and copyright holders, and by using new protection measures like Creative Commons, we have the potential for keeping the vital information flowing, while still protecting the economic investment of those involved in creating a work. As Abrams pointed out, copyright isn't a restriction; it's just a need to ask permission. Oftentimes, you'll get it.

20+ Tips for Searching the NEW Web
presented by Mary Ellen Bates
Thankfully, J of J's Scratchpad already did this one for me. You can read it here. I highly recommend that you do.

SLA Bloggers get-together at the Rialto Cafe
A wonderful chance for me to get to meet some of the people I'm subscribed to, as well as help figure out how to provide even better conference coverage next year. This should be appearing on the SLA blog shortly, and I'll edit the post to link into that once it comes up.

EDIT on 5-6-07: Here is the link to the post on the SLA Blog. Good photos for all the years. I hope to show my smiling face in this pic-series from now on.

Oh, man, am I tired. I think once INFO-EXPO wraps up, I'm going to call it an evening (until the Gold Digger's Ball). I could use a little feet-up time. Perhaps dinner with a high-school friend of mine who's in the area...

SLA 2007 - Google

Now that I've had some rest (not enough, surely), and I have a little time before my conference day begins, I can do my post on Google at SLA.

Honestly, I'm a little disappointed. Google's presence here is remarkably minimal. They don't have much information at their booth, just some tips for using some of their latest products. I think that Google, a first-time exhibitor at SLA, is trying to get synchronized with the librarian community, feeling us out while we do the same to them. This results in lots of really basic material being presented. We got to hear about Custom Search Engine, which I will put into practice on the URI GSLIS site when I get home. The rest, the tips and tricks, Book Search, Patent Search and News Archive, were presented as an overview only. I was more curious about new programs, things that have yet to launch, but Google won't speak to any of these things (as an aside, yesterday afternoon they just launched a new Trends feature in Google Reader, which gives you stats on what you've read, from whom, and what you did with those posts).

When discussing Book Search, I asked if they kept the Subject Headings from records of the books they digitized. Rather than a yes or no, the young lady said they kept everything in the library's original records. Presumably this includes Subject Headings.

As noted in an SLA Blog post, no, Google does not have a library. ::jeer:: But they do have fortune cookies, which are quite delicious.

  • Blue: Blueberry - "Tip: use the search box like a calculator What to type: 4+7, 30% of 55, 20^2 or 13cm in feet What you'll get: the answer"
  • Red: Strawberry - "Tip: exclude results you don't want What to type: bass -fishing What you'll get: results for bass the musical instrument, not the fish"
  • Yellow: Lemon - "Tip: get the weather forecast instantly on your cell phone Send 'weather 90210' to 446453 (Google) What you'll get: the weather forecast for 90210"
  • Green: Mint - same tip as Blue

Monday, June 04, 2007

SLA 2007 - Day 2 (Monday)

Man, am I beat! I got a late start today, but still, I spent so much time running from event to event this afternoon, evening and night, I'm about ready to ZZZZ out right here in the hotel lobby. I will attempt to blog first.

INFO-EXPO (Round 2)
I took a second spin around the expo floor, gathered some more swag, talked to some more vendors (got a few ideas and resources for the thesis), got my passport stamped, ran into someone I met last year at the Newbies meeting, and picked up an invite to the Thompson party.

Google Presents Tales, Tips and Tricks for Librarians
As I write this section, I realize that it's just too huge an information dump for me to handle writing tonight in a single post. I will defer it until later. Until then, you can read Emma Wood's post on the SLA Blog.

Tour of Flying Dog brewery
As I discovered in my CE class on Sunday, Flying Dog brewery offers tours at 4PM on weekdays. This directly conflicted with the SLA presentation Science of Beer, but I think I made a good choice; still tons of beer information, as well as some on the whiskey company next door, and samples for all! I went with Jeannie Bail, the Library Director at Allen & Company LLC, who was in my wikis course. I got to try two Belgian whites, a Barley Wine, and some fine Colorado whiskey. Despite the long walk and being a little late, it was definitely a great time!

Oh, and Flying Dog does have its own Beer Library.

Its neighbour, Stranahan’s Colorado Whiskey, has this lovely, custom-made copper still. Just had to take a photo.

Thompson Party, SLA Boston dinner and IT Division Science Fiction Writers night
All of these were done in whirlwind succession. The Thompson party was at the Natural History Museum, so I got to eat absolutely delicious food while surrounded by wildlife dioramas. I had to leave very, very quickly to get to the SLA Boston dinner, which left from the Hyatt and went to Tamayo on 14th and Lawrence. Good food, good company. After that, I walked over to the IT Division's event. I tried to put in as good a presence as I could, but I was so tired, I had to call it a night.

That essentially brings us to now. Overall, I seem to be losing a lot of things today: my pedometer (got it replaced, and I've almost made up the 6 miles on the old one), my hotel room key (must have dropped it taking out my wallet somewhere...), and my internet connection (I'm just not having tech luck this trip). Hopefully I won't be losing sleep tonight; another long day tomorrow, as well as doing that delayed Google post.

SLA 2007 - Day 1 (the rest of the day)

My first full day at SLA 2007 was packed to the gills. I've already mentioned the amazing 4 hour CE course on wikis. This post is dedicated to the rest of the day.

Before all of this, I grabbed my pedometer. I plan on walking a lot, so why not have the chance to cash in on it? Plus, it's a nifty little gadget for back home.

It was necessary to procure sustenance after such a long stretch of learning, so I hit up the 16th St. Mall and stopped in at Marlowe's (16th and Glenarm). Delightful service (my waiter's name was Adam), and wonderful beer (a local brewery produces an exclusive line, the Big Nose series, for Marlowe's; try the wheat beer!). They specialize in seafood and beef, and do it very well. I hope to go back.

I headed back to the INFO-EXPO after lunch. I cruised the whole place, talking to a lot of vendors and picking up free stuff. I learned a lot about the current state of library technology, publications and services. Had I a copy of An Inconvenient Truth, I probably still wouldn't have waited in the Al Gore book signing line (it was very, very, very long, but I hear it was pretty quick). Ran into Bill Anger, Jane Loescher, Tony Stankus and Lee Peterson from SLA:RI.

Al Gore:
Several of my colleagues have already blogged about Mr. Gore's speech, so I won't repeat too much. It was awesome. Very passionate. And while it would be nice to have such a great guy in the White House, I do agree that he will be much more effective as a 'rock star' raising public awareness. Keep the pressure on, Mr. Gore! If applause is any indication, SLA is behind you.

Some links to issues loosely related to content in the speech, as I found THIS VERY MINUTE on Slashdot:

This is the point in time when I posted my last blog... So much lag. I miss my PDA!

IT Division's Welcome to Denver Open House (Analog Game night):
Man, do IT librarians know how to party! This event was amazing, with lots of fun card and board games, good food, an open bar and lots of good ol'-fashioned networking. I finally got to meet Dave Ware, the fellow from SLA Boston who's been helping me set up the finances to come here to Denver. I also got to meet Jill Hurst-Wahl and Tracy Z. Maleeff (Library Sherpa), as well as see Thomas Dopko from Dow-Jones again (he taught some of my Tech Zone courses last year). I watched a group of librarians from the University of Virginia play cribbage (I chipped in my expertise as needed). I eventually wound up playing Texas Hold'em with Tracy, Dave, and a strong contingent of librarians from Kentucky. I took a class on poker back in my undergrad, but I never seem to get enough practice to get really good. Oddly enough, in the last hand of the evening, when everyone went all in, I won the pot (with something piddly like a Queen high). The pot came with a martini shaker, and there is a photo out there somewhere... I must track it down.

I tried to catch up on my sleep last night, and now, with a venti mocha and an apple fritter in my system, I'm ready to tackle the conference once more. Today, I look forward to Google's Tips and Tricks presentation, a tour of Flying Dog brewery, dinner with SLA Boston, and another wonderful IT Division evening.

Sunday, June 03, 2007

SLA 2007 - Day 1 (Sunday) - CE Course

8:00 - 12:00 : CLICK U LIVE! Organizing your content and collaborating on the web
By Karen Huffman and Cassandra Shieh

This CE workshop focused on how to use wikis to enhance your information collaboration project. We started out with a brief introduction, where everyone stated their names, organizations and favourite places to travel. We then dove immediately into an overview of what wikis are, what they do well, and several examples of prominent wikis, both engines and implementations.

So, why use a wiki? If you have a situation where multiple people are developing some kind of informational project, be it a manual, reference material or an event planner, and you'd like to avoid multiple versions of documents being emailed back and forth, a wiki is for you. It essentially lets you bring everyone into a common workspace, tracking changes and letting you connect pieces of information dynamically.

How is it different from a blog? Blogs are a single voice, even if they are made up of multiple contributors. That voice is presented chronologically, in posts that may come weekly, daily, hourly, or even minute by minute. A wiki, however, is not chronological. True, one can track the changes made over time, but a wiki has a far more complex structure. Its pages, rather than being broken down by units of time, are broken down by units of content. Both tools have their place, and can easily be made to work together.

Can't anybody edit a wiki, thus calling the information therein into question? This is the case only if you want it to be. You can choose, on most wiki engines, what level of permission each kind of user has. The casual web browser can be a read-only viewer, and editors can be assigned by your department. You decide who gets what rights to do what when you set up the wiki. You can make it a free public utility like Wikipedia, or put it on your intranet. It's all up to you.
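That permission model is easy to picture as a toy sketch, plain Python rather than any real wiki engine's configuration (the role names and actions here are invented for illustration):

```python
# Toy sketch of wiki-style permission levels (illustrative only; real
# engines such as MediaWiki configure group rights quite differently).
PERMISSIONS = {
    "viewer": {"read"},                          # the casual web browser
    "editor": {"read", "edit"},                  # assigned by your department
    "admin":  {"read", "edit", "grant_rights"},  # whoever set up the wiki
}

def can(role: str, action: str) -> bool:
    """True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

print(can("viewer", "read"))   # True: anyone can read
print(can("viewer", "edit"))   # False: editing is reserved for assigned roles
```

The point is simply that the wiki owner, not the wiki software, decides where each kind of user falls in a table like this.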

After the overview, we talked a little about 5 major wiki engines:
  • Confluence - This is proprietary software, with a price tag. This is the wiki package SLA bought.
  • MediaWiki - The software behind Wikipedia. This is what we learned the most about.
  • PBwiki - a hosted solution that's quick and easy
  • Socialtext
  • Wetpaint - free, hosted, but ad-supported, and with no user permissions
  • Others can be found and compared at Wiki Matrix

We dove into MediaWiki, working in a sandbox created by Karen (who won a well-deserved SLA award later in the day, btw). This hands-on experience was ABSOLUTELY WONDERFUL, and exactly what I needed from this course. We created our own pages, used templates to save on repeating content, added extensions, uploaded media files, created tables of contents, and much more. I'd give you the example URL, but it's password-protected right now. When I get the new disk image for the SquareOne, which will hopefully support MediaWiki, I'm installing it posthaste.

We intended to go into Wetpaint next, but ran out of time. I will hopefully have a chance to play with it in the near future.

This has to be the most useful course I've ever taken. Thank you, Karen and Cassandra, for a wonderful session. I look forward to using wikis in my future projects (I'm already thinking PBwiki for organizing my wedding).

Saturday, June 02, 2007

Arrival in Denver

After a lot of flying and bus riding, I made it to the hotel in Denver. Nice place. The 16th St. Mall has lots of interesting shops, and free public wifi, which I'll get into in just a moment.

I've been craving a free internet connection since PVD, so that I can configure my laptop to make my time here at the conference easier. Here's what I need to do:
  1. Upgrade Firefox to 2.0.x
  2. Install the necessary extensions
    • FireFTP (to access my server)
    • Twitbin (to keep on a Twitterin')
    • Zotero (an absolutely amazing bibliographic management tool. I could rant. I might later).
  3. Configure my quick bookmarks bar:
    • Conference Wiki
    • All my Google apps, like Gmail, Reader and Blogger
    • Homepage for my Digital Libraries class (I did some reading summaries on the plane)
  4. Make iGoogle my homepage (so I can get a quick snapshot of what's going on)
I'd have done this upon arrival to the hotel, but my computer doesn't seem to have a working Ethernet port anymore. And wifi doesn't get up to my floor. So, here I sit in one of the bars on the street level, typing away, wishing I could be doing it at a desk in my room. This laptop is wicked heavy! Oh, well, we do what we must. It's not like I won't be otherwise incredibly busy starting tomorrow.

Oh, and since I was bumped to a different hotel because of construction, my credit card payment information has been lost in the struggle. This means a $700 hold on my debit card until we can get things squared away financially. I know SLA Boston will make everything work out alright, but I am moving right after I get home, and there is still that pesky lease to sign....

And as an aside, who should be riding the same bus as me to the same hotel, but Tony Stankus! This harkens back to last year, when I thought, "Gee, I wish I'd run into Tony", and BAM! there he walked by. With any luck we can get together for a drink or a meal, but we're both running around like crazy this year. At least I got to see him.

Blog title change

I've never been very good with titles. I've always liked working in the math and science realm, because when you wrote a paper, all you had to do was describe all its key points in the title, and you were set. Coming up with catchy titles, like one must in the arts, social sciences and humanities, is just not a talent of mine. So, I enlisted the help of my wife-to-be, Sarah, to help me come up with a better blog title. You'll notice it above.

In other news, I got an email back from Quad Microworks, and they will be offering a downloadable disk image with the "highest version numbers that the original hardware can handle without slowing down too much" of APM for the 1st gen system. It should be out after the 2nd gen systems ship.

Friday, June 01, 2007

It figures...

Before I can even get the thing fully up and running, they go and release a second generation device...

I doubt they'll have a trade-in program... and after the hassle of moving so many gigabytes of info onto the current system, I'm not sure the doubled CPU and RAM would make up for it. With any luck, they'll provide us first-gen'ers a way to upgrade our MySQL and PHP versions. I've asked.

Getting ready for SLA 2007

I'm currently printing out some articles for my Digital Libraries class, so that I might have productive things to read on the flight to Denver. Unfortunately, many of these documents were born digital, and it sucks up a lot of paper to print them (one of the assigned readings is actually a book, only 47 pages, but still). I'd planned to do other work on the plane, but my laptop is now completely battery-dead, and they don't typically provide wall sockets on a 777. I suppose I could bring a book and ::gasp:: relax a little...

Since my PDA was thoroughly destroyed last winter, and my phone's internet connection is both slow and expensive, I'm also going to make a printed packet of relevant information about my stay in Denver, including a map and calendar. Not nearly as high-tech as I'd like, but I'm pressed for time.

Oh, and I finally figured out why the Dynamic DNS services I'd tried for the SquareOne haven't worked: it's my ISP. Cox blocks incoming port 80, so you can't run any kind of web server. They do not block Telnet or FTP access, so I will at least be able to access my files remotely, but that's why the website hasn't migrated yet. Once we move, we will change ISPs to one that doesn't explicitly block port 80. So long as I don't use my site for commercial purposes, I should be fine within this other ISP's service agreement.
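If you suspect your ISP of the same trick, a quick TCP probe tells you whether anything can reach a given port. Here's a minimal sketch in Python; the self-check at the bottom uses a local throwaway listener (standing in for a remote web server) so it runs anywhere, but in practice you'd call `port_open()` against your public IP from outside the network:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Self-check against a local listener on an OS-chosen free port.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
host, port = srv.getsockname()
reachable = port_open(host, port)   # something is listening, so this succeeds
srv.close()
blocked = port_open(host, port)     # nothing answers once it's closed
print(reachable, blocked)
```

A refused or timed-out connection from outside, paired with a working local server, is the signature of an ISP-level block.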

Interesting note on ISPs: You aren't allowed to use any kind of Linux with their services. Not that there is any technical reason why you couldn't (the SquareOne runs Linux, and here I type), but they just don't want to train their people to deal with it. Not that the support people I've encountered when calling an ISP seem trained at all, but that's one of those rants I shall save for a different blog.

Monday, May 14, 2007

Setting up the Square One, part 1.5

It occurs to me that I should really include a picture of the Square One in its native habitat.

It's being safeguarded by a noble Cylon Centurion. Thanks, Peter!

In other news, I've decided, after talking to the creator of the household Linux system, that it would be better to just install a new distro altogether, one I can master and keep up to date. We've decided to go with Kubuntu, and will start the process once all the documents on the Linux system are safely backed up on the Square One.

Sunday, May 13, 2007

Setting up the Square One, part 1

As I mentioned in an earlier post, I got a Quad Microworks SquareOne personal Internet server for my birthday. Now that classes are over, I felt I had the free time to set it up, and our household need for constant Internet service was low enough to risk losing it for a few hours.

No such loss occurred, thankfully. Step 1 in my setup, the actual wiring of the SquareOne as an Internet router, went smooth as silk, since the computers in the house were already running off a router. Great!

Step 2: Accessing the SquareOne, and mapping its shared folder as a remote network drive.

On Windows, this was simply a matter of going to the IP address of the SquareOne, entering the password, and clicking "map network drive". I now have access to the 320 GB hard drive, as well as all the pre-installed programs, all through my convenient Q: drive.

On Linux, I had figured this would be a little trickier, but the computer was way ahead of me, and the two machines were already talking through Samba. All I had to do was navigate to the shared space through Konqueror, enter the password, and I was all set. I could even play the test file (a Horrorpops mp3) through Amarok. Since one of the goals of getting the server was using it as a space to store our music so that any system on the network could access it, this pleased me greatly.

Step 3: Setting up the networked printer.

I have an HP Deskjet, and until now, only the Windows computer had access to it. Major hassle, since most of the print jobs originated from work done on the Linux system.

On Windows, all I had to do was connect the printer to the USB port on the SquareOne, then follow the instructions in the SquareOne manual. The trickiest part was that I had to do a soft reset on the SquareOne to get everything started.

On Linux, though, I hit my first significant snag. The particular system I'm running is Debian, and it's woefully out of date. The Common Unix Printing System (CUPS) was not installed, since the machine had never had a printer before. Attempting to install it led to all kinds of package update, installation, and removal issues, and lots of warnings about unauthorized sources being a security risk. Talking to the computer's creator, I learned that this system hadn't had any package maintenance in a while, and making all the upgrades would probably be a huge hassle and involve a lot of risk (like my not being able to get the video driver working again...).

Therefore, rather than potentially take out one of our desktop systems, I decided that waiting a little while, then backing up all the data onto the server and installing a new OS would be the best course of action. I've had Ubuntu and Fedora Core 6 recommended.

In this process, I've learned a lot more of what's floating around pre-installed on the Linux system, and how to work with it. So, despite not being able to print yet, the whole experience has been worthwhile.

Step 4: Setting up the SquareOne as a server to host my website, and allow for FTP access.

I've progressed a bit on this, but still have some tests to run before I report on it. Soon!

Monday, April 23, 2007

Addressing the lack of recent updates...

This semester, like most of them, is ridiculously busy. Hence, I haven't updated this blog on what's going on in my professional development in quite a while. I don't have much time right now to do so, but I'll outline a list of things for me to address, and hopefully once the semester ends, I can start filling out the details.
  • The GSLIS website is almost ready to go live; just need to do a few more adjustments, and get faculty approval.
  • I've been handed the Webmaster-ship of the Rhode Island chapter of SLA.
  • My CSC 305 course in Software Engineering has proven to be a bit of a revolution in my way of thinking about website projects.
  • My CSC 536 course in XML databases has started to produce fruit (namely, viable XQueries on the initial version of my household catalogue).
  • I got the new Square One Personal Internet Server, from Quad Microworks, for my birthday, and will soon be hosting my own website.
  • I'm going to SLA Annual in Denver, and Boston Chapter of SLA is financing it (I won their annual travel stipend!)
  • My thesis topic has pretty much settled itself: the organization of music, at the song level, using XML, Native XML databases, and user-generated content.
  • I've been having visions of a new method to use in library school to not only teach students the core material and the latest technologies, but to also develop a powerful resource for the professional community. The feasibility is not all that high right now, but if I laid out the details clearly... just maybe...
There is undoubtedly more than that, but I'll try to catch up on those things first. Only one week of classes left (until summer starts, that is).

Wednesday, January 24, 2007


I just got my letter of acceptance to the Computer Science department, and not a moment too soon. I had tried to change my classes earlier this month, but a Hold had been placed on my account. It demanded that, as a new student, I accept admission before enrolling in any classes. The problem was, the offer of admission hadn't made it into the system yet, so I couldn't. Fortunately, Sue Ryan at the grad school was able to clear everything up with a few clicks.

So, now I'm taking the following:

CSC 305 - Software Engineering: This course is one of the ones I had hoped to challenge or drop, since none of my graduate courses have it as a pre-requisite, and I don't plan on doing any software development in my Information Science career. However, now that I've started reading the text, I can see where the concepts could be applied to web design projects and other information systems. This may prove very valuable for building my team to help improve cataloging.

CSC 412 - Operating Systems and Networks: Apparently, I'll be implementing my own OS in this course... scary, but it should have me well prepared to make the switch to Linux, and to work with Unix webservers. Plus, more group work, so I have another place to apply my 305 knowledge.

CSC 536 - Special Topics in Database Management Systems: This is actually an independent study with just me and Dr. Peckham. I'm hoping to cover XML as a database language, learn more about object-oriented database systems, and explore XML's related technologies (XML Schema, XSL, XPath, XQuery, XLink, XPointer, XForms, etc.). I'd like to learn enough to be able to rebuild part of my website as XML.
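To give a taste of the kind of querying I'm talking about: once data lives in XML, you can pull records out by structure instead of by table joins. Here's a tiny Python sketch using the standard library's limited XPath support (the catalogue fragment is invented for illustration, not the actual schema from the course):

```python
import xml.etree.ElementTree as ET

# A hypothetical fragment of a household catalogue, invented for this example.
doc = ET.fromstring("""
<catalogue>
  <item type="book"><title>Snow Crash</title><room>study</room></item>
  <item type="cd"><title>Hell Yeah</title><room>living room</room></item>
  <item type="book"><title>The Mythical Man-Month</title><room>study</room></item>
</catalogue>
""")

# XPath-style query: titles of all books shelved in the study.
titles = [item.findtext("title")
          for item in doc.findall(".//item[@type='book']")
          if item.findtext("room") == "study"]
print(titles)
```

A full XQuery engine (the kind a native XML database provides) can do far more, like joins and constructed results, but the idea is the same: the query language follows the document's hierarchy.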

CSC 591 - Computer Science Seminar Series: This is required for graduation, and is only for 1 credit. I'm not sure at all what the topics covered will be, since there is no information about it anywhere on the CS website or in the course descriptions.

I haven't actually had any of these classes yet; one professor is injured, the other is in Central America. The seminar meets on Friday, and I have my first meeting with Dr. Peckham on Monday. So, for now, I'm mostly working on GA work (the GSLIS website is almost ready to be assembled). I have also been nominated as the SLA:RI webmaster; this was originally going to be a PFE, but with the 13 other credits, I had to drop it down to just being a professional activity. This frees me up some, but, of course, makes finding the time to do the job that much more difficult. I'll have longer to do it in, I reckon, but I still want to meet reasonable timetables.

In closing news, I was actually able to get all my books this semester through the library.