The readings for Unit 12 represent the first experience I've knowingly had with project management. I say "knowingly" because I held corporate jobs for over 10 years before applying to SIRLS, and during that time lived through some technology changes at work. So, obviously, I was on the user end of some project management decisions without really being aware of the underlying process. Now that I see how time-consuming and involved those projects are, I have a better appreciation for the work required to bring new projects to fruition. Looking back, I remember that most of the time employees were irritated when new technologies were rolled out. Employees were used to the existing process, didn't always feel like learning a new system, and were sometimes slow to see the benefits of the new technology. After all that work, I can just imagine the IT people thinking, "What a bunch of ungrateful bastards!"
The reading that I valued the most this week was Cervone's "How not to run a digital library project". Because this subject is new to me, I found his list of "rules" to be a good start for how to view, and approach, project management. Rule #1 struck me almost immediately. Cervone's comment that, "Even when requirements are gathered, they are ignored in the belief that the project team does, in fact, know better what the end user wants," reminded me of a couple instances at work when new systems failed to efficiently do what employees required. For example, I remember a new software program that hid a frequently used function in a location that was cumbersome to access (3 or 4 clicks instead of 2). This happened because the project team was unaware how we, the employees, used that function.
I also liked the WAG example in Rule #4. I don't remember seeing this acronym before, but I appreciate its meaning and understand the pitfalls of using a WAG for planning purposes. On the surface, the difference between "effort" and "duration" appears subtle, but it's an important distinction when estimating the time and cost of a project.
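Just to make the effort/duration distinction concrete for myself, here's a toy calculation (all numbers are made up): the same 120 person-hours of effort can stretch over very different calendar durations depending on staffing and availability.

```python
# Sketch: effort vs. duration. Effort is total work to be done;
# duration is elapsed calendar time, which depends on how many
# people are assigned and how much of their day they can give.
effort_hours = 120               # hypothetical: 120 person-hours of work
people = 2
hours_per_person_per_day = 3     # they also have other duties

duration_days = effort_hours / (people * hours_per_person_per_day)
print(duration_days)  # 20.0 working days, not "120 hours of elapsed time"
```

Estimating from effort alone (a 120-hour job "should take three weeks") ignores exactly the availability factor that turns a guess into a WAG.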
Another favorite was Rule #8. I certainly see how "scope creep" can quickly undermine a project, leading to cost overruns, time delays, loss of focus, and confusion regarding end user needs. I imagine that the more ambitious the project (and the more people involved), the greater the risk of "scope creep". Cervone's advice to "be flexible, but firm" is well taken. Obviously, flexibility is necessary because some changes are inevitable with big projects, but stray too far, and the final product may please no one. Therefore, if major changes are required, returning to the drawing board to draft a new plan may be warranted.
Monday, August 2, 2010
Wednesday, July 28, 2010
The End Complete
I decided to add DigIn to my SIRLS MA because I know digital collections will play an increasingly important role in the future of libraries and archives. Prior to this course, my only real exposure to the technical side of computing came during IRLS 571 in fall semester 2009. Success in that class encouraged me to pursue further studies, which brought me to 672 this summer. I entered both classes apprehensive that I might not be able to keep up, and afraid that my classmates would be starting well ahead of me in their experience and understanding. Fortunately, my fears were largely unfounded.
I liken my experience so far to learning a new language (which is essentially what we’re doing). Eleven weeks ago I understood basic computer functions – hardware, software, networks, etc. Today that understanding has been reinforced by an additional layer; namely, the LAMP stack. Except for phpMyAdmin, I had actually heard of the other three components before this summer, but had never used them. Before 672 I had no real understanding of how digital collections were designed or implemented. I knew databases formed a critical component, but couldn’t articulate much beyond that. Today I have an elementary appreciation for how they work and the underlying architecture. Obviously, I am far from prepared to actually apply this limited knowledge to a real-world project, but I know enough to feel attuned to the language and characteristics of digital collections. I think this stuff is new enough to me that I haven’t had any major changes in perspective yet, probably because my initial perspective was so undeveloped. But, if I’ve gained a greater appreciation for one aspect, it’s database design. I’ve toyed with Access a little in the past, but it wasn’t until our units on databases that I really began to realize how complicated database design really is.
This was my first DigIn class so, of course, there’s a long way to go. And I’m not going to lie, I still feel a little apprehensive about 675 this fall. I often worry that one week I won’t get the material, and I’ll fall behind and never catch up. But, at the same time, I’m excited to continue. There’s a certain satisfaction in being able to make a computer do what you want, especially when the results are displayed in a browser. Somehow, browser displays seem more tangible. So I’m going to call this class a success, hope I don’t forget what I learned over a couple weeks break, and pick up where we left off in 675.
Sunday, July 25, 2010
Databases, week two
Let me first say, I like SQL. I've struggled with certain aspects of it, but so far I'm enjoying learning it, and creating databases is probably the most fun thing we've done in class. That said, there is still plenty I need to work on.
Setting up tables and attributes using Webmin and phpMyAdmin was easy thanks to the GUI. I remain an advocate for GUIs whenever possible, and this unit served to reinforce my prejudice. The bulk of my comments, therefore, will be reserved for the MySQL monitor.
As usual, the UACBT tutorials were quite helpful and easy to follow, and I found the print version from W3Schools to be useful for the dropbox assignment. Following along with the tutorials is not hard; likewise, the assignment instructions were easy to follow. The difficulty lies in translating the lessons into self-created syntax. As I started to work on the dropbox assignment, I found it difficult to construct the proper syntax from memory despite having seen it only moments before. This is, of course, like everything in life. It's easy to read a good book; much harder to write one. Fortunately, it did become easier as I proceeded through the assignment, a trend I expect will continue with practice.
Multiple table queries, in particular, remain taxing to construct correctly. And I still have only a tenuous grasp of inner, left, and right join. I understand roughly what they do, but it's still a bit fuzzy. Nonetheless, this part of the course seems very translatable to what I might do in the future, and I look forward to the possibility of creating databases for real-world applications.
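To untangle the joins for myself, here's a minimal sketch run against SQLite's in-memory engine (the photographer/photo tables are made up): an INNER JOIN keeps only rows that match on both sides, while a LEFT JOIN keeps every row from the left table and fills the right side with NULL where there is no match.

```python
import sqlite3

# Hypothetical two-table example: photographers and their photos.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE photographer (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE photo (id INTEGER PRIMARY KEY, title TEXT, "
            "photographer_id INTEGER)")
cur.executemany("INSERT INTO photographer VALUES (?, ?)",
                [(1, "Ana"), (2, "Ben")])        # Ben has no photos
cur.execute("INSERT INTO photo VALUES (1, 'Sunset', 1)")

# INNER JOIN: only rows with a match on both sides.
inner = cur.execute("""
    SELECT p.name, ph.title
    FROM photographer p
    INNER JOIN photo ph ON ph.photographer_id = p.id
""").fetchall()

# LEFT JOIN: every photographer, with NULL (None) where no photo matches.
left = cur.execute("""
    SELECT p.name, ph.title
    FROM photographer p
    LEFT JOIN photo ph ON ph.photographer_id = p.id
    ORDER BY p.name
""").fetchall()

print(inner)  # [('Ana', 'Sunset')]
print(left)   # [('Ana', 'Sunset'), ('Ben', None)]
```

(A RIGHT JOIN is the mirror image: every row from the right table; SQLite historically lacked it, so you swap the table order and use LEFT JOIN instead.)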
Friday, July 16, 2010
Databases, week one
I actually found this week's topic enjoyable, even though I'm far from understanding everything. I believe I can decipher a basic ERD, although drawing one from scratch remains a bit more difficult. The one I designed for the dropbox assignment is very simple, yet I'm still unsure I included every possible relationship. Bridge tables and the "O" relationship will take some additional work to fully understand. For instance, I think all of the tables in my ERD can stand alone (a country, attraction, and photographer can exist independently of each other), yet I hesitated to include any "O's" in my diagram. Not sure why...
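Writing out a bridge table helped it click for me. Here's a sketch, loosely following my ERD but with made-up table and column names: a junction table turns a many-to-many relationship (attractions and photographers) into two one-to-many relationships.

```python
import sqlite3

# Sketch of a bridge (junction) table resolving a many-to-many
# relationship between attractions and photographers.
# All names here are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE attraction   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE photographer (id INTEGER PRIMARY KEY, name TEXT);
-- Bridge table: one row per (attraction, photographer) pairing.
CREATE TABLE shoot (
    attraction_id   INTEGER REFERENCES attraction(id),
    photographer_id INTEGER REFERENCES photographer(id),
    PRIMARY KEY (attraction_id, photographer_id)
);
INSERT INTO attraction   VALUES (1, 'Eiffel Tower'), (2, 'Louvre');
INSERT INTO photographer VALUES (1, 'Ana'), (2, 'Ben');
INSERT INTO shoot VALUES (1, 1), (1, 2), (2, 1);  -- Ana shot both sites
""")
rows = cur.execute("""
    SELECT a.name, p.name
    FROM shoot s
    JOIN attraction a   ON a.id = s.attraction_id
    JOIN photographer p ON p.id = s.photographer_id
    ORDER BY a.name, p.name
""").fetchall()
print(rows)
```

Each side of the bridge can still stand alone, which is why the "O" (optional) markings go on the relationship lines rather than changing the tables themselves.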
First normal form is pretty easy; third normal form I'm still not clear on. In fact, second and third normal form look almost the same to me. Both normalization and ERDs are manageable with simple databases of 3-4 tables, but one can easily imagine how difficult relationship diagrams and normalization become for more complicated databases.
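The distinction I keep having to re-learn: second normal form is about attributes depending on only part of a composite key, while third normal form is about attributes depending on other non-key attributes (transitive dependencies). A sketch of a 3NF fix, with made-up tables: country depends on city, and city depends on the key, so the city-to-country fact gets its own table.

```python
import sqlite3

# Sketch: in attraction_flat below, country depends on city (a
# transitive dependency), so 'France' would repeat for every Paris
# attraction. 3NF says to move the city -> country fact into its
# own table. All names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Violates 3NF: country repeats for every attraction in a city.
CREATE TABLE attraction_flat (
    id INTEGER PRIMARY KEY, name TEXT, city TEXT, country TEXT);

-- 3NF version: the city -> country fact is stored exactly once.
CREATE TABLE city (id INTEGER PRIMARY KEY, name TEXT, country TEXT);
CREATE TABLE attraction (
    id INTEGER PRIMARY KEY, name TEXT,
    city_id INTEGER REFERENCES city(id));
INSERT INTO city VALUES (1, 'Paris', 'France');
INSERT INTO attraction VALUES (1, 'Eiffel Tower', 1), (2, 'Louvre', 1);
""")
# A join recovers the flat view, but 'France' lives in one row only.
rows = cur.execute("""
    SELECT a.name, c.name, c.country
    FROM attraction a JOIN city c ON c.id = a.city_id
    ORDER BY a.id
""").fetchall()
print(rows)
```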
Same thing with SQL... basics are easy, but to do any "heavy lifting" is going to require time, patience, and practice. The tutorials (especially the UACBT videos) are great, but I'm a long way from sitting down and creating a mature database from scratch. There's nothing surprising here - the basics are easy for most things in life, while achieving demonstrable proficiency requires greater dedication. Fortunately, I find the idea of creating databases for digital collections intriguing, and might pursue something of this nature for my Capstone project.
Sunday, July 11, 2010
Technology plans: the simpler, the better.
I wish to comment on three articles from this week concerning technology plans; namely, the Whittaker, Chabrow, and Schuyler articles. I found each interesting for different reasons, and will briefly relate what I took away from each.
First, the Whittaker article. I don't dispute that many technology implementations fail to materialize, but I found the research methodology in this article suspect. Obviously, wasted time and resources on poorly planned technology initiatives are a major issue; however, this article left me unconvinced that it's taken very seriously by many institutions, particularly large ones. For example, only 14% of the research surveys were returned? That is not a very inspiring number. Does that mean 86% of the recipients don't think it's a serious issue, and so didn't take the time to participate? Also, 1450 surveys were sent, but only 176 "arrived in time to be analyzed for this report". So, really, only 12% of the surveys sent were used. Apparently, of these, 61% reported a failed IT initiative, but I wonder if that failure provided the motivation to participate in the survey. As anyone with customer service experience knows, customers are much more likely to share a bad experience than a good one.
Also, the survey was sent to "chief executives", but many of the respondent comments blame upper management for IT failures. Would chief executives really blame themselves for the failures? I doubt it. I suspect the surveys were passed to other (unidentified) parties for completion. Bottom line - this article didn't convince me that most institutions are distressed by the success rate of their IT projects.
Even though the Chabrow article focused on government IT plans, I took interest in several points it made. First is the idea that it's preferable to "fail fast" on IT projects that appear to be off-track. Recognizing that a plan isn't working, and taking steps to quickly change or abandon it, saves time and money - good advice, I believe. There should be no sacred cows with IT projects. Don't throw good money, or good time, after bad. Admit it's not working, re-evaluate the need and the plan, and change or dump it.
Chabrow cites frequent changes in management as one issue that can lead to failed IT projects. Managers inheriting a project they weren't originally involved in planning is a recipe for setbacks and failure. The risks are obvious: lack of interest, wanting to make changes, different priorities, etc. Continuity of management and staff is critical for success, particularly for long-term projects.
One thing that can help raise the success rate of projects is to implement them with "incremental steps and rollouts that deliver benefits along the way". I think this is great advice, especially for large projects. Small rollouts are less likely to meet with problems or resistance from staff, and allow for small successes over shorter time periods than waiting years for some big project to reach completion.
Finally, I really enjoyed the Schuyler article. His assertion that technology plans are a "political document" is true. They are often implemented by upper management because they're necessary for grant applications, but often are created without input from the IT department. Often the authors of technology plans aren't the same people that actually implement them. And his advice that technology plans are best kept vague rings true. Many technology plans look years ahead, but things happen. Recessions happen. Technology changes. Needs change. Overly specific technology plans are exactly the ones most likely to fail.
Eventually I hope to have the opportunity to contribute to a technology plan. At the very least, I'll have to read them, because my future career will probably involve some grant writing. In a sense, technology plans can seem like a necessary evil - a hoop one has to jump through to obtain funding for projects. As I mentioned above, I really think keeping them flexible and vague is good advice. As quickly as technology and business needs change, the most useful technology plan is a flexible one.
Sunday, July 4, 2010
XML - same as HTML, only different
I proceeded to familiarize myself with XML this week using the tools recommended in the assignment - namely, the Wikipedia articles, w3schools.com tutorials, and the Mark Long Introduction to XML videos. Last week's lesson on HTML certainly made XML easier because the languages are similar. Each of the tools mentioned above was helpful. In fact, at this early stage, I imagine any tool would be useful to a novice such as myself.
As mentioned before in this course, the Wikipedia articles, while generally current and thoughtful, often incorporate more detail than I'm prepared to appreciate at this point. Therefore, I often read the first third or half of each article to understand the basics, and usually find myself glazing over by the end. On the bright side, I do understand more of each article than I would have in early May, so I'm definitely learning (albeit slowly).
The w3schools and Mark Long videos were quite helpful. The w3 lessons work because you read them at your own pace, and they're easy to go back to. Plus, they're typically concise and to the point. The w3 lessons provided a good foundation for the Long videos, which are a bit more detailed. I have not yet viewed the DTD and Schemas sections of the videos, but plan to return to these later. Part of my hesitation is simply that I don't understand the difference between the two, although I'm sure the videos will help clear up my cloudiness.
The actual XML document was not hard to write. I did run into a minor problem trying to code a URL into an element, but that was resolved by using &amp;amp; in place of a bare &. I am curious how to code an actual URL link into XML, like "a href" in HTML. I haven't yet found how to do this, although once I delve deeper into the tutorials my question will likely be answered.
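A small experiment that helped me see the escaping issue: if you build the document with an XML library instead of by hand, the library escapes the ampersand for you on output and un-escapes it again on parsing. (The URL below is made up.)

```python
import xml.etree.ElementTree as ET

# Sketch: characters XML reserves (like &) must be escaped in text
# content; ElementTree handles this automatically on serialization.
photo = ET.Element("photo")
url = ET.SubElement(photo, "url")
url.text = "http://example.com/search?a=1&b=2"   # hypothetical URL

serialized = ET.tostring(photo, encoding="unicode")
print(serialized)  # the & comes out as &amp;

# Parsing turns it back into a plain & in the element's text.
parsed = ET.fromstring(serialized)
print(parsed.find("url").text)
```

As for linking: to my understanding, XML itself has no built-in equivalent of HTML's "a href" - an element is just data. Making it behave like a link is left to standards layered on top (such as XLink) or to whatever application consumes the XML.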
Sunday, June 27, 2010
HTML's not so bad...
My first real experience with HTML came during 504 last summer. I am certainly still a novice, however, to this point find HTML fairly easy to work with and not intimidating. Of course, I'm basing this on the very simple websites we've been required to produce to this point, so my rosy assessment could quickly change if future assignments involve complicated HTML coding. But, so far at least, I'm having some fun with it.
This week I focused on reviewing the PowerPoint from 504, and followed it in creating my unit 6 web page (which looks quite similar to my 504 page). So far in SIRLS, I've been required to produce a web page about every 6 months. This is often enough to remember some basics, but too infrequent to become comfortable with the process - particularly in posting the pages to the U-System account. I'm sure DigIn will afford many future opportunities to create websites, so the process will undoubtedly become more familiar. Right now, for some reason, I always lack confidence that the transfer of files to the U-System account will go smoothly, and fear the page will be missing elements that are present when viewing the document during creation. Images, especially, I'm afraid won't transfer properly, leaving me with a page full of the dreaded "X" symbol.
I also viewed lessons from the w3schools.com website. These are helpful, easy to understand, valuable for reinforcing what I already know, and present new concepts in a manner that is accessible to the layperson. So far I'm sticking to the basic lessons, but plan to revisit the more advanced ones as we proceed through the course. A couple little things surprised me. For instance, future versions of HTML won't allow you to skip certain end tags that can be missing now (although it's not recommended). This only surprised me because I imagined rules might become more flexible as the code evolved, not less. Also, I'm still not clear on the differences between HTML, XHTML, and XML. From what I've seen, the code for each looks fairly similar. I believe XML prioritizes data content over style, although this is certainly an oversimplification.
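One concrete difference I can already demonstrate to myself: HTML tolerates certain omitted end tags (a browser will happily render a list whose `</li>` tags are missing), but the identical markup is ill-formed as XML. A tiny experiment using Python's XML parser:

```python
import xml.etree.ElementTree as ET

# Legal-ish HTML (end </li> tags omitted), but ill-formed XML:
# XML requires every element to be explicitly closed.
fragment = "<ul><li>one<li>two</ul>"

well_formed_xml = True
try:
    ET.fromstring(fragment)
except ET.ParseError:
    well_formed_xml = False

print(well_formed_xml)  # False
```

XHTML, as I understand it, is essentially HTML rewritten to follow XML's stricter rules, which is why the three look so similar on the page while behaving differently in a parser.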
Sunday, June 20, 2010
Someday there will be a pill for this...
During the 2009 winter break I read Gibbon's Decline and Fall of the Roman Empire. I have a BA in History (with an emphasis on Ancient Greece) with a Classics minor, so I've read a lot of books like this over the years. Yet I distinctly recall, after finishing the 1,500-page volume, holding it closed in my hands and asking myself, "How much of this do you remember?" Of all those pages, what I clearly recall could probably be reduced to 10 pages. All that information, and I likely retained only 1/150th of it - a paltry, and depressing, amount.
But I guess all learning is like that. What we recall is often infinitesimal compared to what we actually encounter. Memories of travel will help illustrate my point. I've been fortunate to visit 21 countries so far, and hope to explore many more, yet if the sum of my memories were transferred onto a DVD, the total running time would be only a couple of hours. Fleeting bits of this; pieces of that. And many still shots.
I enjoyed the Felder and Soloman article because I saw elements of myself in each learning style presented. As they mention, most people incorporate multiple styles in their quest for knowledge. In the above examples, my BA was accomplished primarily through reading, while travel promotes learning by seeing and doing. Although I'm comfortable with various learning strategies, I identify most closely with the reflective-sensing-visual-sequential combination.
The information presented in unit 5 (like many of our units) incorporated visual, verbal, and active learning styles. I especially liked the Warriors of the Net video, which enhanced my understanding of the material through a fun visual presentation. The lecture itself was clear and understandable, as were most of the Wikipedia articles. However, while I never felt lost in the lecture, several of the Wikipedia articles went beyond my current level of understanding. Felder and Soloman suggest that most people learn best with a combination of visual and verbal strategies, and I think the material this week supports their hypothesis. I do, however, look forward to the day when scientists develop a pill which "downloads" information directly into our brains. Imagine learning a new language by simply swallowing a pill! Of course, "downloading" Greek will have predictable side effects including headache, flushing, and delayed back ache. And, if Greek gives you an erection lasting more than four hours, you should see your doctor immediately.
Sunday, June 13, 2010
A little information is dangerous...
Thanks to concise and efficient instructions, this week's configuration of users and groups went smoothly. As I reported last week, I have yet to encounter any major installation obstacles (fingers crossed hoping this continues). I was expecting to be able to log out of the TightVNC terminal and log in as a different user (like in VM), but understand why this isn't possible. I still prefer GUI platforms for most applications, although I found some value in the CLI this week.
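One thing that helped the user-and-group material stick for me: the accounts we created ultimately live in plain text files. As a sketch (the "jdoe" entry below is entirely made up, not from the assignment), each line of /etc/passwd packs seven colon-separated fields that cut can pull apart:

```shell
# Format of an /etc/passwd line (this 'jdoe' entry is hypothetical):
# name:password:UID:GID:comment:home:shell
line="jdoe:x:1001:1001:Jane Doe:/home/jdoe:/bin/bash"

echo "$line" | cut -d: -f1   # user name: jdoe
echo "$line" | cut -d: -f3   # numeric UID: 1001
echo "$line" | cut -d: -f7   # login shell: /bin/bash

# Group membership is recorded the same way in /etc/group, and
# 'id' shows the current user's UID, GID, and group list.
id
```

Seeing the accounts as rows in a text file, rather than something hidden in a control panel, made the whole exercise feel less mysterious.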
My issue is not with following directions, or installing applications. My issue is the shallow level of understanding I have for all the things we've done to this point. I comprehend the basics of CLI, permissions, commands, VPN, VMware, etc. but lack the confidence that comes with a depth of understanding. Translating the permissions, for example, is still giving me a bit of trouble. I need to spend more time looking at those, but my angst does not end there.
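Since translating the permission strings is exactly what trips me up, here's the decoding exercise I've been walking myself through, on a throwaway file rather than anything from the assignment:

```shell
# An ls -l permission string has ten characters: the file type, then
# three rwx triplets for owner, group, and others.
# chmod's numeric mode builds each triplet from r=4, w=2, x=1.
touch demo.txt
chmod 644 demo.txt             # 6 = rw- (owner), 4 = r-- (group, others)
ls -l demo.txt | cut -c1-10    # prints -rw-r--r--
chmod 750 demo.txt             # 7 = rwx, 5 = r-x, 0 = ---
ls -l demo.txt | cut -c1-10    # prints -rwxr-x---
rm demo.txt
```

Adding up the 4/2/1 digits and checking the result against ls -l is slowly making the triplets feel less like hieroglyphics.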
Today is Sunday... I completed the assignments for this unit on Thursday, and already I have to look back at screens and notes to remember what each of the 3 methods even looked like. It reminds me of the kind of haziness one experiences when thinking back to vacations taken as a child, except this was only 3 days ago! So I'm frustrated that I'm not remembering things with more clarity. Practice will help (something I said two weeks ago), but each week presents new information and challenges and I'm not sure I'll ever feel on top of the content. There's an element of fear in this because I don't want to get behind. I apologize if this post is slightly off-topic from the blog instructions for the week, but this sense of anxiety is my primary and overriding concern right now.
Monday, June 7, 2010
Bargaining - the third stage of grief
I hesitate to mention this for fear of jinxing my good luck, but so far all my downloads and installations have worked to perfection. In addition, I'm pleased to report that I have encountered no issues when following the assignment directions, and each command and process presented by Prof. Fulton has worked as advertised. This has been a boon to my 672 morale, as I was quite concerned at the outset about possible system crashes and software failures. Ironically, the only computer problem I've had is with the display driver which, despite a recent update, continues to occasionally fail, leaving my screen black and the mouse inoperable.
The CLI commands continue to provoke the most frustration - not because they don't work, but because remembering all of them seems impossible. I mention in my discussion post that I'm still not sure which commands to prioritize, but it may be a non-issue. Continued practice makes each more memorable, and I'm sure practice will likewise reveal which commands are most useful and ubiquitous. At least I'm losing my fear of the command line, which I consider a minor victory.
I enjoyed the VIM tutorial, and was fascinated by how powerful a few simple commands could be. After studying the tutorial, I understand for the first time why CLI advocates believe commands are faster than using a GUI. Of course, I realize the point of the course is not to persuade us to use CLI in our everyday computing, but I am slowly developing a greater appreciation (acceptance?) for CLI. As I alluded to above, however, most of the tutorial commands have already evacuated my short-term memory. Undoubtedly, additional practice will be required to use them proficiently. The hidden files interest me and I find myself asking, "What are they hiding?" Unless I'm missing something, we still haven't opened many files, so I remain curious about their content. For instance, how are text files presented upon opening? In a word-processor type format like Word, or in the same environment that the tutorial utilizes?
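To partially answer my own questions: as far as I can tell, a "hidden" file is simply one whose name starts with a dot, and text files open right in the terminal rather than in anything Word-like. A quick sketch with a scratch file:

```shell
# ls skips dot-files unless you ask for them with -a:
touch .secret
ls | grep secret || echo "plain ls does not show it"
ls -a | grep secret              # -a reveals .secret

# Text files display in the terminal itself, not a word processor:
#   cat FILE    dumps it to the screen
#   less FILE   pages through it (press q to quit)
#   vim FILE    opens it in the same environment the tutorial uses
rm .secret
```

So the dot-files aren't hiding anything exotic; most seem to be configuration files that stay out of the way of everyday listings.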
The configuration went smoothly, although I'm unsure when I'll use some of it, like the aliases. I don't recall consciously configuring a computer before, although I must have each time I got a new one. Perhaps, because I would have used a GUI, I was unaware that my actions qualified as "configuration". In any case, the step-by-step instructions were easy to follow and worked.
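For the record (and for my future self), an alias is just a shorthand the shell expands before running a command. A minimal sketch, with a made-up alias name rather than anything from the assignment:

```shell
# Define a shorthand; from now on 'll' runs 'ls -l':
alias ll='ls -l'

# 'alias' with no arguments lists every definition in effect:
alias

# 'type' shows what a single name expands to:
type ll

# An alias typed at the prompt lasts only for the current session;
# putting the definition in ~/.bashrc makes it load with every login.
```

Written out like that, I can at least imagine the use case: shortening the handful of long commands I end up typing constantly.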
All told, everything has worked well during the first three weeks. My biggest question remains, "What relevance does all this have to my future career as a librarian/archivist?" Apparently the answers to such questions will be revealed over the next few weeks, and I anxiously await the coming revelation.
Monday, May 31, 2010
The command line tolls for thee.
This week's command line lessons reminded me a lot of math - and why I'm not good at that, either. It seemed easy at first, and even a little fun. A few lessons later (especially by the IO Redirection tutorial) I was getting lost in a sea of commands and arguments. So let's take a quick look at what worked, and what didn't.
Logging into the VNC was problematic at first because, although I had installed it correctly, I was unaware that I needed to first log in to vpn.arizona.edu before connecting to the VNC. Fortunately, the discussion board included a thread discussing this, after which I was able to log in quickly and reliably.
I first viewed Arthur Griffith's lessons, and was surprised to find that many of the commands didn't work on the VNC home directory files. For instance, the ls command lists several directories (downloads, music, pictures, etc.), but going further (for example, ls pictures) always seemed to return a message stating the file or directory didn't exist. I found this surprising, and frustrating. Nonetheless, basic commands (cd, ls, rm, mkdir, etc.) were clear, easy, and represented the high-water mark of my understanding for this week. Adding and changing directories, listing files, removing files - all of these tasks I remembered quickly and required little practice.
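Those basics fit together into a little round trip that I've been repeating until it feels automatic (all throwaway names, done in a scratch location):

```shell
mkdir scratch            # create a directory
cd scratch               # change into it
touch notes.txt          # create an empty file
ls                       # list the contents: notes.txt
cd ..                    # go back up
rm scratch/notes.txt     # remove the file
rmdir scratch            # remove the now-empty directory
```

Being able to run that loop without looking anything up is, so far, my one unqualified CLI success.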
My outlook remained bright through the first few linuxcommand.org lessons as well. As stated above, however, things changed when I got to the IO Redirection lesson and after. I'm sure we've all had moments in life when we see or read something new and difficult, then sit back and ask ourselves, "What did I just read?" because it's a blur and we can't remember anything. Well, this was one of those moments. From that section on, I was lost. Following are some of the highlights (or lowlights) of these sections.
To begin, the pwd command seemed to return the same information (which directory you're in) as already appears before the $ symbol. Is this correct? Also, I know cp and mv mean copy and move respectively; however, they seem similar enough to remain confusing. I understand the concept of wildcards, but the range of symbols and functions involved in utilizing them remains baffling and will require additional practice to use correctly. Additionally, I'm having difficulty comprehending the ideas of standard output/input, and got caught in a loop of >'s when I experimented with them. Needless to say, I'm still not sure what these do - let alone how to explain them. Finally, I got the pipe command "ls -1 | less" to work, but all it did was return the list of directory files in alphabetical order. Pipes are one more thing on my list that needs additional work.
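Writing out the confusing pieces as far as I currently understand them (scratch files only, and I may be wrong about some of this):

```shell
pwd                        # prints the full current directory; the
                           # prompt often shows only an abbreviation

echo hello > a.txt         # > redirects output into a file
cp a.txt b.txt             # copy: a.txt AND b.txt now exist
mv b.txt c.txt             # move/rename: b.txt is gone, c.txt remains

ls *.txt                   # the * wildcard matches a.txt and c.txt
ls -1 | wc -l              # a pipe feeds ls's list into a line counter
rm a.txt c.txt
```

If that's right, then cp leaves the original behind while mv doesn't, and a pipe is just redirection aimed at another command instead of a file. Which would also explain my loop of >'s: I kept pointing output at files when I meant to point it at programs.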
I mentioned in my discussion post that I can't imagine ever using CLI to perform routine computing functions, and that viewpoint remains unchanged. I practice interface monogamy, and am committed to my GUI. However, I also mentioned that certain commands, like ping, can only be performed through a CLI, and I do find some value in learning basic commands. This week was a great start, but I'm going to need considerable additional practice if we're going to be using CLI in earnest. My plan for tomorrow is to print out the linuxcommand.org lessons so I have them available for easy reference as we go forward.
Monday, May 24, 2010
Free Ubuntu beginner's guide on ubuntuforums.org
While perusing ubuntuforums.org today, I came across a thread entitled, "Free Beginner's Guide" in the Absolute Beginner Talk forum. The thread provides a link to a free booklet by Keir Thomas (author of Ubuntu Kung Fu) called, "Ubuntu Pocket Guide and Reference: A concise companion for day-to-day Ubuntu use". 92 pages of responses praised the usefulness of this book for new Ubuntu users and, as a beginner myself, I decided to follow the link to Google Book Search where the book is available for free. (http://books.google.com/books?id=kHLlJzI6L20C&printsec=frontcover#v=onepage&q&f=false) At first glance, the book looks quite useful for beginners, and I've already saved it to my "favorites" for future reference. Chapter topics include installing, configuring, managing, and securing Ubuntu systems. For the purposes of this post, I will limit my comments to information contained in the Introduction.
The introduction provides a brief history of Linux, including the role Richard Stallman had in designing the original Free Software operating system (called GNU), and Linus Torvalds' later kernel, which took on the name Linux (to the displeasure of Stallman). Ubuntu is one of hundreds of Linux versions (called distributions); others include Red Hat and SUSE. As the book notes, "This variety is possible because of the freedom allowed by Free Software - anybody can take the source code and make their own version." (Thomas, 2009)
Thomas identifies 3 areas where Ubuntu excels compared to other Linux distributions. These are a focus on desktop users, ease of use, and the Ubuntu philosophy and community. While most Linux distributions can be used on desktops and servers, Ubuntu takes special care to ensure a pleasant desktop user experience. In fact, it was designed specifically to compete with the dominant Microsoft Windows operating system.
Ubuntu also strives to offer a positive philosophy and community for its users. The philosophy is centered on the idea that the source code and software is free and available for modification to all users. This is not fundamentally different from most Linux versions; however, Ubuntu has remained true to this principle where other distributions eventually included proprietary programs or limited distribution. In addition, the community that supports Ubuntu (particularly through ubuntuforums.org) has remained dedicated to providing a positive desktop experience.
Ease of use is another area where Ubuntu differentiates itself from other Linux versions. In fact, Ubuntu calls itself the "Linux for human beings". Rather than being overly technical to operate, Ubuntu can be used without complete reliance on the command line interface, and is relatively easy to install and update. While still a powerful tool for the "techie" community, Ubuntu's focus on desktop functionality makes this version of Linux accessible to a more casual audience. And Ubuntu offers all the applications expected of commercial operating systems, such as word processing, web browsing, image-editing, and music playback.
Thomas' book is a great resource for the beginning Ubuntu user. I expect to consult it frequently as we begin using Ubuntu in earnest - a prospect which I find increasingly exciting the more I learn about the Linux operating system.