Thursday, September 30, 2010

Week 5 Comments

http://sarahwithtechnologyblog.blogspot.com/2010/09/week-5-comments.html?showComment=1285871883573#c8024663277691824988

http://nrampsblog.blogspot.com/2010/09/unit-5-reading-notes.html?showComment=1285874936608#c5262867092224111660

http://maj66.blogspot.com/2010/09/muddiest-point-week-four.html?showComment=1285875766828#c5070807601837688784

10/4 Reading Notes

Wiki Databases
We always used to tell our students not to use Wikipedia for research assignments, because it was notoriously untrustworthy. The more I use it for this class, however, the more I'm won over by its simplicity. I still don't believe that people are basically good, though.

It had not occurred to me that the World Wide Web is itself a database; it is perhaps the mother of all databases, the meta-database of the family. You can see this clearly in the diverse applications which firms like Google have created to harness different aspects of the web. Interestingly, this means that anybody can insert themselves and their interests into a global database (in fact, most of us have been included without our consent).

Getty Metadata
When I first thought about metadata I wondered if it wasn't just a semantic ploy--a newly created field to get published in. I've changed my mind since then.
In the library profession metadata must be the primary way we interact with information. It enables us to evaluate an object without necessarily consuming it. I like the notion that metadata continues to accumulate throughout the lifetime of an object.
I wonder if there is any danger in objects becoming (at least to information professionals) nothing more than the sum of their metadata. At some point, can the forest obscure the trees?

Dublin Core
This discussion was more technical than I generally like in my light reading....
I think I understand the overarching concept: a need exists for universal metadata descriptors by which an object can be searched for across disciplines.
Is that correct?
Apparently there is even funding for such an effort, which seems charitable of someone.
Is there a governing, authoritative body which will enforce this, after it is implemented? Otherwise I don't see how it will succeed. If so, will it be similar to Dewey? Discuss. (How pompous is it to conclude with "discuss?")
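To make the concept concrete for myself: a Dublin Core record is just a handful of standard elements (title, creator, date, and so on) attached to an object. Here is a quick sketch in Python of what a minimal record might look like; the example record and its values are made up, not taken from the reading.

```python
import xml.etree.ElementTree as ET

# The Dublin Core element set lives at this namespace URI.
DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)

def dc_record(**fields):
    """Build a minimal Dublin Core record from keyword arguments
    such as title, creator, date, type."""
    record = ET.Element("record")
    for element, value in fields.items():
        child = ET.SubElement(record, f"{{{DC_NS}}}{element}")
        child.text = value
    return record

# A hypothetical record for a photograph in a local collection.
record = dc_record(title="Imaging Pittsburgh",
                   creator="University of Pittsburgh",
                   date="2010",
                   type="Image")
print(ET.tostring(record, encoding="unicode"))
```

The point of the standard is that any discipline's catalog could read the same fifteen elements, so a record built this way is searchable across collections that know nothing about each other.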

Muddiest Point 9/28

I'm confused by the Jing assignment, but I'll keep working on it....

If we continue to primarily listen to audio through compressed internet media, will our lowest common denominator for sound quality plummet? Or will technology advance to the point that we can no longer tell the difference?

Saturday, September 25, 2010

9/27 Comment

http://sek80.blogspot.com/2010/09/week-4-reading-notes.html?showComment=1285445406535#c4042747661202527015

9/27 Reading Notes

Data Compression (Wiki)
Essentially, data compression is a trade-off on some level. With lossy compression, a user sacrifices some information in order to transmit the whole in a smaller, simpler bundle: it shaves off the edges to make the information fit better. Lossless compression, by contrast, sacrifices nothing; it re-encodes the same information so that it can be described in fewer bits.
From what I understand, another trade-off is financial: a user can opt to pay for specialized equipment or software that does the heavy lifting of compression without losing any information in the process.
Is the “buffering” process a side-effect of data compression?
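The lossless side of the trade-off is easy to see for yourself. A short sketch using Python's built-in zlib module (my own illustration, not from the reading): repetitive text shrinks considerably, and decompression gives back every byte exactly.

```python
import zlib

# Lossless compression: the original bytes come back exactly.
text = ("The rain in Spain stays mainly in the plain. " * 50).encode("utf-8")
compressed = zlib.compress(text, level=9)

print(f"original:   {len(text)} bytes")
print(f"compressed: {len(compressed)} bytes")

# Decompression restores every bit -- nothing was "shaved off".
assert zlib.decompress(compressed) == text
```

Lossy schemes (JPEG, MP3) have no such round trip: the discarded detail is gone for good, which is exactly the bargain they strike for smaller files.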

DVD HQ
This was difficult to get through. I’ll be honest: it’s tough to care about the details of data compression. I would like to be able to compress and decompress information, so I suppose the back-story will be helpful at some point.
I’m impressed that so much goes into breaking down every bit of information that we transmit, and that it works so smoothly most of the time.
What I did take away from this is the complexity and intricacy of this applied technology. I fully appreciate the investment in programming that allows this to work. The idea of adding a difference image between video frames (and individual blocks within each frame) is a bit fascinating. It supplies a good example of problems and solutions that build upon one another as our understanding of the potentials and limitations of programming reveals itself.

Imaging Pittsburgh
This is a great database. Every city should have something comparable, though there is a limited audience for such things. Whenever I browse collections like this I grow nostalgic for people and places I have a history with.
I agree that one of the big challenges of the program will continue to be finding avenues for users to explore the collection. I think that a clickable city map would really be helpful. Interestingly enough (depending on your perspective), access to a social history—for any type of community—is one of the best ways to unify it and to mobilize it. This database could be valuable for promoting community organization.

YouTube in the Library
I agree with the author that YouTube could make library education easier for librarians. Hosting a library education or orientation channel would also put the onus on students, rather than libraries, to demonstrate the services of a library. Video hosting sites also promote community; as video hosting becomes easier, library users could be encouraged to post their own relevant videos.

Thursday, September 23, 2010

Muddiest Point 9/20

     I don't have much to say here, other than to observe that software is updated with such rapidity that by the time I learn half the tricks of Word (or other simple, pedestrian applications) I am forced to relearn everything. 
    I think the theme of the digital era is a collective agreement that we cannot know everything. 
    The death of the polymath....

Saturday, September 18, 2010

9/20 Comment

http://maj66.blogspot.com/2010/09/computer-history-museum.html?showComment=1284840783467#c5634232965411378842

Thursday, September 16, 2010

9/20 Reading Notes

Linux
     My brother-in-law wanted to build me a Linux-run computer, but never got around to it.  Great story, right?
     I think part of the appeal of Linux is the community that has developed around it.  Only Mac users are more fiercely devoted to their machines, though I suspect that has more to do with popular image than anything.  Linux devotees seem to be loyal not only to their operating system, but to the idea of self-made, almost subversive computing.  Linux makes the uncool kids feel cool; like they're getting away with something that we squares don't understand, because we still buy our software.
     I like that there is an operating system that is mostly free of the politics of the market, and that hobbyists have made every bit as good as the commercial industry leaders.

Mac OS X
     I'm amazed that Mac has managed to stay so exclusive in their software for so long.  They don't seem to play well with other developers' products, yet they maintain a high price-point.  This article did not sell me on OS X, but it is interesting to see an unbiased report comparing operating systems.

Windows
      I always assumed that new versions of Windows came out so regularly to dupe us into buying the updated software.  It seemed almost cruel to force businesses, hospitals and schools to invest in and adopt new PC infrastructure every two years....
      I don't expect the newest version of Windows to behave (for a novice) any differently than its predecessor, but maybe (aside from shrewd marketing) Microsoft had some good excuses for rolling out  updated operating systems so often: 1. technology really had significantly advanced, 2.  its competitors were right there with them,  3. they had a vested interest in keeping their clientele trained in their most recent versions, and 4. people were willing to pay for it.
       The strength of Windows may be that so many firms compete to build software for it, keeping the field competitive.  Mac, by comparison, is mostly closed to other firms, and I feel like I would need a computer science degree (or a less social high school experience) to get the most out of Linux.
    

Muddiest Point 9/13

More of an observation, really: 
Now that we have basically made computers as small as we would like them to be, is the obsession with smaller-is-better computing over?  Is most of the computer hardware we saw this past week on its way out?  Will our kids ever even use a desktop computer?

Saturday, September 11, 2010

Notes, 9/13

Computer Hardware
   I don't claim to understand how my computers work.  I know more now than I ever have, though, and the subject is more approachable.   I've always thought of computers as similar to motor vehicles in their complexity of parts.  The difference to me is that I grasp the basic mechanics of cars, and have even attempted minor repairs for fun and profit.  PCs, on the other hand, always occupied a place of mystery nested somewhere between unified field theory and Area 51.  Perhaps the basic problem is that everything is so small, and all those tiny soldered circuits seem to be quite busily doing nothing at all.
   I think the layered wiki-explanation is a good approach.  I appreciated the breakdowns and links, and actually feel more confident about dissecting my technology the next time it acts out at me.

Moore's Law
    Moore's law (the doubling of transistor counts on a chip roughly every two years) does much to explain the rate of change that has marked the tech field for decades.  It is remarkable that so many firms have stayed competitive in such a rapidly-changing industry.  The R&D costs of keeping up with the market leaders must be enormous.  Someone told me once that the chip in my phone was about ten times more powerful than the technology that landed man on the moon.  I cannot think of another industry that could support production change at this rate. 
    Practical obsolescence is the most striking part of Moore's law.  When you spend millions of dollars developing materials that will be obsolete in two years, it affects everything from supply chain management (you can't carry much inventory) to marketing and cost forecasting.
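The arithmetic behind that rate of change is simple to sketch. Taking the Intel 4004 as a conventional starting point (roughly 2,300 transistors in 1971) and doubling every two years gives projections like the following; the figures are illustrative of the rule of thumb, not exact chip counts.

```python
# Moore's law as a rough rule of thumb: transistor counts double about
# every two years. Start from the Intel 4004 (~2,300 transistors, 1971).
def projected_transistors(year, base_year=1971, base_count=2300):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

By this back-of-the-envelope projection, forty years of doubling turns a few thousand transistors into a few billion, which is why two-year-old inventory is nearly worthless.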

Computer Museum Fun
     This was a great virtual field trip.  The time-line was dry but engaging, and could easily stand alone as a primer in computing history.  I particularly enjoyed the dis-invention of the computer: no one can prove that they invented it, so the rage at being denied such a fantastic patent can be spread out to multiple claimants.
     I did not know about the earliest advances in the field: the Navy requested a computerized flight simulator during WWII.  Even though those MIT slackers couldn't build one until the 50's, it is interesting that the potential of the "adding machine" was recognized so early (GM wanted to design cars with computers in the 50's). 

Thursday, September 2, 2010

Week One Reading Notes

On Clifford Lynch
      Mr. Lynch has taken a broad view of IT Literacy education.  I must agree with him, up to a point.
      I agree with Mr. Lynch that most people learn how to survive with their technology, but not how to thrive with it.  We learn word processing, but are not really taught how to craft those words into a publishable document.  Our creative efforts would be rewarded tenfold if we took the time to learn all the bells and whistles of the simplest software we already own and use.
      The author advocates a comprehensive scope for IT education.  For example: he seems to think it practical to teach students how their search engines work, not just how to make them work.  In an ideal world, with perfect teachers, perfect learners, and fewer constraints, this would be fantastic!  I hope this course doubles (at least) my understanding of IT as it relates to library science, but I don't feel cheated or guilty that I don't know more than I do going into it.

On Content, Not Containers
     I enjoyed this, against all odds.  Here is what I appreciated:
     I hadn't thought before that text had been exclusively married to paper, but is now promiscuously tarting about all over the Internet.  What had previously only appeared in books and physical formats is buzzing through airways and wires, popping up on my phone and laptop. 
     The idea that new technologies hinder our access to information hadn't occurred to me.  It is practically free to transmit information in almost limitless amounts, so providers charge for the access itself.  It makes sense that I am willing to pay for access to an online journal if I cannot read it for free in the library.  The format has cost me money for the convenience of access. The irony is that I probably would never have known the journal existed had I not found it using an online search.
     As we become bombarded with more and more formats, it will take discipline and insight to choose the best information over the flashiest container.  YouTube may be offering the same content as those mid-nineties 'zines we all printed out of our garages, but bright colors and easy accessibility do not make the actual information any better, just easier to use.

On J. Vaughan
     This was a refreshingly practical look into the IT processes and offerings of a major academic library.  After reading about all the accommodations UN has made for its students (laptops, multimedia and such), I wonder if those students will graduate with a new understanding of what libraries are all about?  Duke is giving everyone iPods, but will students appreciate that they are more than multimedia delivery systems?
     Mr. Vaughan has challenged my understanding of academic library budgets.  As information becomes easier to access, and mostly free to consume, we can shift our library money away from text acquisitions and toward giant computerized cranes that retrieve old books, and toward laptops for everyone!
   

Muddiest Point
     I wonder if all the technology we pump into our libraries will make the librarian obsolete in ten years?  Will society outsource my profession to the lowest bidder with a Skype account?   Will physical libraries be necessary after we digitize all of our materials?  Will we someday be able to research as loudly as we please?