External Network Unification Part 1: Research and Development

"Systems Boy! Where are you?"

I realize posting has been lean, lo these past couple weeks. This seems to be a trend in the Mac-blog world. I think some of this has to do with the recent dearth of interesting Mac-related news. In my case, however, it's also due to a shocking lack of time brought on by my latest project.

You may or may not remember, or for that matter even care, about my ongoing Three Platforms, One Server series which deals with my efforts to unify network user authentication across Mac, Windows and Linux systems in my lab. Well, that project is all but done, except for the implementation, which we won't really get to do until the Summer, when students are sparse, time is plentiful, and love is in the air. (Sorry, but if you manage a student lab, you'll probably understand how I might romanticize the Summer months a bit.) Anyway, we've got our master plan for user authentication on our internal network pretty much down, so I've turned my attention to the external network, which is what I've been sweatily working on for the last two weeks.

Our external network (which, for the record, has only recently come under my purview) is made up of a number of servers and web apps to which our users have varying degrees of access. Currently it includes:

  1. A mail server
  2. A web host and file server
  3. A Quicktime Streaming Server
  4. A community site built on the Mambo CMS
  5. An online computer reservations system

Beyond these five systems, additional online resources are being proposed. The problem with the way all this works right now is that, as with our internal network, each of these servers and web apps relies on its own separate and distinct database of users for authentication. This is bad for a number of reasons:

  1. Creating users has to be done on five different systems for each user, which is far more time consuming and error prone than it should be
  2. Users cannot easily change their passwords across all systems
  3. The system is not in any way scalable because adding new web apps means adding new databases, which compounds the above problems
  4. Users often find this Byzantine system confusing and difficult to use, so they use it less and get less out of it

The goal here, obviously, is to unify our user database and thereby greatly simplify the operation, maintenance, usability and scalability of this system. There are a number of roadblocks and issues here that don't exist on the internal network:

  1. There are many more servers to unify
  2. Some of the web apps we use are MySQL/PHP implementations, which is technology I don't currently know well at all
  3. Security is a much bigger concern
  4. There is no one on staff, myself included (although I'm getting there), with a thorough, global understanding of how this should be implemented; meanwhile, these servers, databases and web apps are maintained and operated by many different people on staff, each with a different level of understanding of the problem
  5. All of these systems have been built piecemeal over the years by several different people, many of whom are no longer around, so we don't completely understand how things are working now

All of these issues have led me down the path upon which I currently find myself. First and foremost, an overarching plan was needed. What I've decided on, so far, is this:

  1. The user database should be an LDAP server running some form of BSD, which should be able to host user info for our servers without too much trouble
  2. The web apps can employ whatever database system we want, so long as that system can get user information from LDAP; right now we're still thinking along the lines of MySQL and PHP, but really it doesn't matter as long as it can consult LDAP
  3. Non-user data (computers or equipment, for instance) can be held in MySQL (or other) databases; our LDAP server need only be responsible for user data

That's the general plan. An LDAP server for hosting user data, and a set of web apps that rely on MySQL (or other) databases for web app-specific data, with the stipulation that these web apps must be able to use LDAP authentication. This, to me, sounds like it should scale quite well: Want to add a new web app? Fine. You can either add to the current MySQL database, or if necessary, build another database, so long as it can get user data from LDAP; that way user data is never duplicated and always stays consistent. It's important to remember that the real Holy Grail here is the LDAP connection. If we can crack that nut (and we have, to some extent) we're halfway home.

This plan is a good first step toward figuring out what we need to do in order to move forward with this in any kind of meaningful way. As I mentioned, one of the hurdles here is the fact that this whole thing involves a number of different staff members with various talents and skill sets, so I now at least have a clear, if general, map that I can give them, as well as a fairly clear picture in my mind of how this will ultimately be implemented. Coming up with a plan involved talking to a number of people, and trying out a bunch of things. Once I'd gathered enough information about who knew what and how I might best proceed, I started with what I knew, experimenting with a Mac OS X server and some web apps I downloaded from the 'net. But I quickly realized that this wasn't going to cut it. If I'm going to essentially be the manager for this project, it's incumbent upon me to have a much better understanding of the underlying technologies, in particular: MySQL, PHP, Apache and BSD, none of which I'd had any experience with before two weeks ago.

So, to better understand the server technology behind all this, I've gone and built a FreeBSD server. On it I've installed MySQL, PHP and OpenLDAP. I've configured it as a web server running a MySQL database with a PHP-based front-end, a web app called MRBS. It took me a week, but I got it running, and I learned an incredible amount. I have not set up the LDAP database on that machine as yet, however. Learning LDAP will be a project unto itself, I suspect. To speed up the process of better understanding MySQL and PHP (and to forgo learning LDAP for the time being), I also installed MRBS on a Tiger Server with a bunch of LDAP users in the Open Directory database. MRBS is capable of authenticating to LDAP, and there's a lovely article at AFP548 that was immensely helpful in getting me started. After much trial and error I was able to get it to work. I now have a web application that stores its data in a MySQL database, accessed via PHP, but gets its user data from the LDAP database on the Tiger Server. I have a working model, and this is invaluable. For one, it gives me something concrete to show the other systems admins, something they can use as a foundation for this project, and a general guide for how things should be set up. For two, it gives us a good idea of how this all works, and something we can learn from and adapt for our own code. A sort of Rosetta stone, if you will. And, finally, it proves that this whole undertaking is, indeed, quite possible.

So far, key things I've learned are:

  1. MySQL is a database (well, I knew that, but now I really know it)
  2. PHP is a scripting/programming language that can be used to access databases
  3. MySQL is not capable of accessing external authentication databases (like LDAP)
  4. PHP, however, does feature direct calls to LDAP, and can be used to authenticate to LDAP servers
  5. PHP will be the bridge between our MySQL-driven web apps and our LDAP user database
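That last point is easy to sanity-check from the command line before writing any PHP. A simple LDAP bind with a user's DN and password is the same test a PHP app performs with ldap_bind(). Here's a sketch using the standard OpenLDAP client tools; the server address, DN and password are hypothetical stand-ins for your own directory's values:

```shell
# Hypothetical server and DN -- substitute values from your own directory.
LDAP_URI="ldap://ldap.example.edu"
BIND_DN="uid=jdoe,cn=users,dc=example,dc=edu"
PASSWORD="secret"

# ldapwhoami attempts a simple bind; exit status 0 means the password
# checked out against the directory.
if command -v ldapwhoami >/dev/null 2>&1; then
  if ldapwhoami -x -H "$LDAP_URI" -D "$BIND_DN" -w "$PASSWORD" >/dev/null 2>&1; then
    AUTH="ok"
  else
    AUTH="failed"
  fi
else
  AUTH="skipped"   # OpenLDAP client tools not installed on this machine
fi
echo "LDAP bind: $AUTH"
```

If the bind succeeds, any web app that can issue the same bind (via PHP's LDAP functions, in our case) can authenticate that user without keeping its own password database.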

So that is, if you've been wondering, what I've been doing and thinking about and working on for the past two weeks. Whew! It's been a lot of challenging but rewarding work.

This is actually a much bigger, much harder project than our internal network unification. For one, I'm dealing with technologies with which I'm largely unfamiliar and about which I must educate myself. For two, there are concerns — like security in particular — which are much more important to consider on an external network. Thirdly, there are a great many more databases and servers that need to be unified. Fourth, scalability is a huge issue, so the planning must be spot on. And lastly, this is a team effort. I can't do this all myself. So a lot of coordination among a number of our admins is required. In addition to being a big technical challenge for me personally, this is a managerial challenge as well. So far it's going really well, and I'm very lucky to have the support of my superiors as well as excellent co-systems administrators to work with. This project will take some time. But I really think it will ultimately be a worthwhile endeavor that makes life better for our student body, faculty, systems admins and administrative staff alike.

Go Away or I Shall Replace You with a Very Small Shell Script

"Think about it: the computers work great until the users come in and start fucking everything up. I don't hate computers, I hate users."
-systemsboy

This is a rant borne out of the recent feeling that I hate everyone and wish they would all just leave me the fuck alone. I don't think there is a SysAdmin alive who has not felt this way at least once in his life. This one's for you.

Until recently I naively believed that as time progressed, and as new generations came up in the information age, a familiarity with computers would breed a more tech-savvy user. Everyone's always talking about how kids who grew up in this digital age of ours are so much better with computers than adults who had to learn about them as, well, adults. This is total shit. I fall into the latter camp — I started learning about computers in earnest in 1997 — whereas most of the students I teach are in the former — they grew up with computers in their schools and homes and have used them all their lives. And I've assumed, again naively, that I'd see progressively more tech-savvy users with each successive class of students. The dream was that the students of the future would have far fewer technical problems, and be far more self-reliant when it came to troubleshooting said problems. Unfortunately, by and large, the reverse seems to be true: students seem more helpless — and at the same time more demanding — than ever before.

There's an episode of Star Trek involving an ancient race of beings who rely heavily on a technology which they've completely lost the ability to understand. This technology is, essentially, a computer with the ability and intelligence enough to run their planet — and itself — for centuries without human intervention. After generations of relying on this computer, the people forget how it works, and when it finally breaks down — see, they always break down eventually — it begins emitting such powerful radiation that it renders the population sterile. But they're at a loss as to how to go about fixing it. It's quite a bind. Ultimately, they end up kidnapping a bunch of kids before finally getting the Enterprise engineer to fix their system. Or, I should say, to show them how to fix it.

I think what may be going on today is similar. I think today's computer users are like those people in the Star Trek episode. They've completely lost touch with a technology upon which they're reliant. The new generation of student is actually less tech-savvy because, rather than seeing the computer as a tool that he must learn and understand in order to use properly, he sees it as some sort of birthright and as something that should, as they say, "just work."

This is not completely the wrong attitude. Computers, to a certain extent, should "just work." But I think the whole reason we put quotation marks around that phrase is because we SysAdmins all know, deep down inside, that that idea is, to a certain extent, a pipe dream. And doubly so for art students, who have a tendency to use computers in ways in which they were never intended to be used. Computers are extremely complex mixtures of hardware and software interacting with users on behalf of their desires, needs and expectations. When they break or even fail to function in certain ways, I hardly find it surprising. It is, quite frankly, par for the course. It's the reason I have a job.

But these days people seem to think my job is to fix any and every computer problem that might occur, both in the lab and outside of it. Many users refuse to undertake any troubleshooting steps themselves and come immediately to me for help, when often a simple reboot will solve their problem. I actually had a student contemptuously ask me why he should have to reboot the computer, as if it were ridiculous that A) there was a problem in the first place, and B) he should have to do anything about it himself. There is a sense of entitlement and an intellectual laziness that seems pervasive lately among end-users. It's all I can do to get users to Google a question they have or check the help files before coming to me for help with a problem. Consequently, a far-too-sizable chunk of my time is spent answering questions to which the answers are readily available online or right there on the computer. Or worse, looking up the answers to those questions for users who are too lazy or arrogant to do it themselves. It's infuriating.

"Give a man a fish and you feed him for a day. Teach him how to fish and you feed him for a lifetime."
-Lao Tzu

When I was in school I spent a great deal of time and effort troubleshooting my personal computer, and thereby learning about computers and how they work. In the process I also learned how to find information I needed on a given topic or problem. We didn't have a class in this. I am completely self-taught. I taught myself how to fish. And I've made a career out of it. Where I work there is a class on systems. It's required. But many students object to this requirement and resent having to take this class. Not only do they refuse to learn to fish, they seem to expect someone else to catch, gut, cook, cut and hand-feed them the fish. And when it doesn't taste just so, they spit it back in your face.

I believe in understanding the tools of your craft. The great Renaissance painters understood the chemical interactions of pigments in oil. They knew how to mix primer and rabbit skin glue, and how to construct and stretch canvas. These days we have paint in tubes and pre-stretched canvas, but any painter worth his salt still has a fundamental understanding of the chemicals in paint and the best way to go about making a stretcher. Computer art students would do well to follow this model. And, quite frankly, they should do so happily. They should be in love with their tools. If they're not, maybe they should find another medium. 'Cause otherwise they're going to end up kidnapping children. And that just ain't right.

Scripts Part 5: New Spotlight Disabler

Someone recently commented that my script to disable Spotlight was no longer functioning in v. 10.4.5 of Tiger. When I went to check on the functionality of the old script, I realized I'd been working on a new and improved version a while back, and that I'd intended to post it, but completely forgot to. So I went in and finished up this spiffy new version, and I'm posting it today for anyone who's interested, or for anyone for whom the previous version had stopped working.

This new version comes with the same disclaimers as the other one (which are now listed in the script itself), but gives you a few more options for disabling Spotlight. In particular, you can now choose to disable/enable Spotlight on either a single volume or all volumes. The script will also report the Spotlight status of all currently mounted volumes before asking you what you want to do.
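For the curious, the heart of per-volume disabling can be sketched with the mdutil tool that ships with Tiger. This is just a sketch, not the script itself, and "/Volumes/Scratch" is a hypothetical volume name:

```shell
# Minimal sketch of per-volume Spotlight toggling with mdutil (included
# with Tiger). "/Volumes/Scratch" is a hypothetical volume name.
VOLUME="/Volumes/Scratch"
ACTION="off"   # use "on" to re-enable indexing

if command -v mdutil >/dev/null 2>&1; then
  mdutil -s "$VOLUME" || true   # report current indexing status
  # Actually toggling indexing requires root:
  #   sudo mdutil -i "$ACTION" "$VOLUME"
else
  echo "mdutil not found; on the Mac, run: sudo mdutil -i $ACTION $VOLUME"
fi
```

Running the same command with "/" as the volume covers the boot drive; the full script simply wraps calls like this in the status report and prompts described above.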

Enjoy!

SpotlightEnableDisable Script
See the code

Notes on Live Quicktime Streaming

There are a few details to setting up a Quicktime Streaming Server for live broadcast that I always forget, particularly when it comes to authentication. I just went through the irritation of setting this up for the umpteenth time (it's an annual occasion), and thought I'd post some quick, pertinent notes on the process so that next year I can get the info here instead of scouring the 'net.

Quick Notes:

  • In order to do live streaming via Quicktime Broadcaster's Automatic Unicast there must be a user or group authorized to do so in the qtusers or qtgroups files located in /Library/QuickTimeStreaming/Config/
  • Create users with the qtpasswd command (qtpasswd -h for help; qtpasswd username to set the password)
  • Any user in the qtusers file can do live streaming to any directory in the Movies folder that has a properly configured qtaccess file
  • The qtaccess file looks like this:

    <limit WRITE>
    require user streamuser
    </limit>
    require any-user

    where "streamuser" is the name of the user who is authorized to stream

  • Finally, the live streaming directory must be owned and read-/write-able by the user qtss
  • Live streaming can then occur by simply pointing Quicktime Broadcaster to the live streaming directory on the specified server with the username and password of the user specified in the qtaccess file
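The whole checklist above can be sketched in a few shell commands. This is only a sketch: "streamuser" and the "live-example" directory are hypothetical names, the qtpasswd and chown steps are left as comments because they require a machine actually running QTSS, and the qtaccess file is written under /tmp here purely for illustration:

```shell
# Stand-in for a directory under /Library/QuickTimeStreaming/Movies/;
# /tmp is used here purely for illustration.
LIVE_DIR="/tmp/live-example"
mkdir -p "$LIVE_DIR"

# 1. On the QTSS machine, create the broadcast user (prompts for a password):
#      qtpasswd streamuser

# 2. Drop in a qtaccess file: only "streamuser" may WRITE (broadcast),
#    while any user may view the stream.
cat > "$LIVE_DIR/qtaccess" <<'EOF'
<limit WRITE>
require user streamuser
</limit>
require any-user
EOF

# 3. Make the directory owned and read-/write-able by the qtss user:
#      sudo chown qtss "$LIVE_DIR"
#      sudo chmod u+rw "$LIVE_DIR"

cat "$LIVE_DIR/qtaccess"
```

With that in place, Quicktime Broadcaster just needs the server address, the live streaming directory, and streamuser's name and password in its Automatic Unicast settings.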


Quicktime Broadcaster: Automatic Unicast Settings

That's it. Those are the sticking points for me every year. Hopefully, next year I can just check back here and get up and running in a few easy steps.

If not, here are some good links:
Webcast How-To
Authentication
Apple Developer

Yawn! Slowest News Week Ever

Despite yesterday's "fun" announcement from Apple, it's been a slow news week. In fact, the whole of February's been pretty dull.

Yesterday Apple unveiled one new and one revised product. The new product is Apple's iPod Hi-Fi, which is essentially a speaker set for your iPod. Now I'm not a big iPod fan. I just don't get the appeal of listening to music while walking around. Call me old fashioned. So the iPod Hi-Fi really leaves me understandably cold. And that new iPod leather case? If you listen closely you can hear my head hitting the desk as I pass out from boredom.

The revised product is only slightly less thrilling: a revamped Mac mini. The most hyped feature of the new machine is that it now sports a fast Intel processor — quite a step up from the G4 of the previous models. Apple's getting a lot of mileage out of this Intel switch, and they're being very smart about updating the slow, G4-based machines first. Personally, though, I'm more excited about the fact that the new Mac mini comes with gigabit ethernet. This makes it much more viable as a server, which is what I use my (well, the school's) Mac mini for currently. Other cool features: The high-end model boasts a faster DVD burner that is capable of burning dual-layer discs, and memory can now be upgraded to 2GB. So, aside from the obvious processor enhancement, there is a lot to recommend the new Mac mini over its predecessor. I will go out on a limb and say that the high-end model gives you a faster processor with an extra core, a larger hard drive, and a much better optical drive, so in my opinion, it's well worth the extra 200 clams. The only bummer about the Mac mini is the use of integrated Intel graphics, which borrows processor power and system memory to handle graphics rather than using dedicated video hardware. (Not that this was a great graphics performer anyway.) The impact of this on overall system performance remains to be seen, but I tend to think it will be negligible.

Still, it's hard for me to get excited about any of this. Lately the rumors have been so much cooler than the reality. That fake video iPod might just have got me to finally cough up the cash for one, just for the cool factor alone. And the idea of a tablet or an ultra-portable gets me pretty hot as well. I'm definitely in the unimpressed column and have really been itching for a new Apple product for some reason. What we got yesterday just doesn't do it for me.

Man, I'm bored. I'm so bored I'm actually writing a post about things I'm not even very interested in. Now that's bored.