H. Peter Alesso.
e-Video: Producing Internet Video as Broadband Technologies Converge.
Boston: Addison-Wesley, 2000.
paper, 290 p. and CD-ROM, ISBN 0-201-70314-9, US$44.95.
Electronic video, or "e-Video" as in the title, is the continuous display of full-motion digital images, which is now beginning to reach the humble desktop PC. This has ignited an explosion of applications, from interactive multimedia products to video telephony devices and services. e-Video covers all of the main aspects of digital video technology while avoiding heavy mathematical theory. It explains the basics of this emerging field, including the technology, the products, the standards and the countless software and hardware possibilities.
The book is divided into four main sections spanning 11 chapters, not including the appendices or the introduction. Part 1, "Video Opportunity", covers "Bandwidth" and "Internet Video Opportunities"; Part 2, "Video Production", covers "Production, Capturing and Editing Video Content"; Part 3, "Video Compression", covers "Video Compression", "RealNetworks and SMIL", "Windows Media, QuickTime and other formats" and "MPEG Streaming"; and Part 4, "Video Delivery", covers "High-speed Networks Prepare Video", "Server Requirements", "Live Webcasts" and "Future Strategies for Video".
As you may have deduced, I had high hopes when I began reading e-Video. Those hopes grew shorter lived the more I explored the innards of the book and the accompanying CD-ROM examples. H. Peter Alesso has done an adequate job of providing the reader with a good overview of this emerging technology, though the book could have been so much better had more attention been paid to the final CD-ROM and the accompanying code samples before publishing. I know that gremlins get into projects, even mine, but these errors should have been cleaned up. The author and the publishers may have tried to produce both a learning tool and an invaluable reference, a goal which, I believe, they have not fully achieved.
The book's supporting images and statistical diagrams were not of a quality to make them worthwhile, even where they are in colour. In fact, they took more words away than they replaced in the text; a case of padding the book, in my humble opinion as a SMIL programmer. For example, the screen shot of Adobe's Premiere application was too small and not in colour, whereas all the RealNetworks RealProducer images were much too large and could at least have been cropped. I found the text repetitious in places, sometimes on the same page, so I found myself skipping blocks of text.
The accompanying CD-ROM should have been more thoroughly tested, as the example SMIL presentation contains many errors, the most obvious being that the "<par></par>" and "<seq></seq>" tags were mixed up, so the company logo showed and then disappeared before the presentation started. The result: an unhappy client, and more time now required to re-edit, correct and test the presentation. The source code of the "How To" example in the book also bears no relation to the version on the CD, which is more complex. Had Alesso included the different versions of the "How To" example code from the first edit onwards, slowly developing the content in stages by adding more complex edits, the reader would have been better able to see the presentation come together. The supporting Web site has similar coding errors and is certainly not up to the standard of other supporting sites I have seen; for example, I could not find errata for this book anywhere on it.
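For readers unfamiliar with SMIL, the distinction at issue is simple: children of a <seq> element play one after another, while children of a <par> element play in parallel. A minimal sketch (file names here are hypothetical, not taken from the book or CD) of a logo that displays first and then hands over to a clip with captions might look like this:

```xml
<smil>
  <body>
    <!-- seq: play children one after another -->
    <seq>
      <!-- the logo shows alone for three seconds... -->
      <img src="logo.gif" dur="3s"/>
      <!-- ...then the video and its captions play together -->
      <par>
        <video src="presentation.rm"/>
        <textstream src="captions.rt"/>
      </par>
    </seq>
  </body>
</smil>
```

Swapping the <par> and <seq> elements in a layout like this would produce exactly the symptom described: the logo flashing up and vanishing before the presentation proper begins.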
Though I may have been disappointed with the SMIL examples, I did find the book to be an adequate resource for information relating to SMIL and other e-Video media. e-Video will serve you well as a reference, or overview, when developing synchronised multimedia projects, so long as you understand that you will need to debug the examples found in the book. - Glenn Dalgarno
David N. Blank-Edelman.
Perl for System Administration: Managing Multi-platform Environments with Perl.
Sebastopol, Calif.: O'Reilly, 2000.
paper, 444 p., ISBN 1-56592-609-9, US$34.95.
One mark of a good writer is the ability to make an arcane, difficult and potentially dull topic interesting. David Blank-Edelman has this ability. The author's expertise is clear from the fact that explanations are easy to follow, even when the issues are deeply technical. To add interest, Blank-Edelman uses solutions to the problems he tackles to introduce the reader to new programming techniques. For example, in the first chapter, on filesystems, the concept of recursive code is introduced and exemplified by a Perl subroutine to scan a directory recursively for core-dump files.
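The book's subroutine is written in Perl; as a rough illustration of the same recursive technique (a sketch in Python rather than the author's actual code), scanning a directory tree for core-dump files might look like this:

```python
import os

def find_core_files(directory):
    """Recursively collect the paths of files named 'core' under directory."""
    found = []
    for entry in os.listdir(directory):
        path = os.path.join(directory, entry)
        # Recurse into subdirectories (skipping symlinks to avoid loops)...
        if os.path.isdir(path) and not os.path.islink(path):
            found.extend(find_core_files(path))
        # ...and record any core-dump files encountered.
        elif entry == "core":
            found.append(path)
    return found
```

The function calls itself on each subdirectory it meets, which is the essence of the recursive pattern the chapter introduces.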
The book comprises nine chapters, each covering a different aspect of system administration: filesystems; user accounts; user activity; TCP/IP name services; directory services; SQL databases; electronic mail; log files; and security and network monitoring. These are followed by five excellent tutorial appendices on the Revision Control System (RCS), LDAP, XML, SQL and SNMP. Each of the three main operating systems in current commercial use, Unix, Windows NT/2000 and MacOS, is treated equally well: one of the reasons for choosing Perl for certain tasks is that the code is similar and, in theory at least, portable across all three.
I shall not describe every chapter in the book but rather point out some of the more striking features.
An XML User Account Database
The chapter on user accounts starts with the details of user identity under Unix and Windows NT/2000 (but not MacOS). The remainder of the chapter is devoted to the creation of an account database to manage users. In a boxed section entitled "Why Really Good System Administrators Create Account Systems", the author argues well for going beyond the built-in OS tools for user account management to a more sophisticated database model. The database need not involve a sophisticated SQL server system: it could consist of flat text files. To illustrate this concept he describes in detail how to use XML for a user account database.
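To give a flavour of the approach (the element names here are my own invention, not the book's actual schema), a single account record in such a flat XML file might look like:

```xml
<account>
  <username>jsmith</username>
  <uid>1004</uid>
  <home>/home/jsmith</home>
  <shell>/bin/tcsh</shell>
  <status>active</status>
</account>
```

The point of the boxed section is that once account data lives in a structured file like this, Perl scripts can create, audit and retire accounts consistently on every platform.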
TCP/IP Name Services
There is a boxed section in this chapter called "Got "System Administration Database" Religion Yet?" If users' account details can be handled using a database, so can host details. Configuration files created by a script from a database are far less likely to contain fatal typos than ones created by human fingers. This is system administration 'best practice', and the details are explained in this chapter. The most detailed example is that of a database for generating DNS configuration files (full code listings are given, although a CD-ROM is not supplied). Once DNS is configured, the network's health has to be monitored, and again there are code examples to cover this. NIS and WINS are mentioned but are not covered in such detail. NIS+ is ignored because its use is 'marginal'.
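As a sketch of the principle (not the book's actual code; the record format and field names are assumptions of mine), generating zone-file A records from a simple host database might look like:

```python
# A toy host "database"; in practice this could be flat files or SQL.
hosts = [
    {"name": "mail", "ip": "192.168.1.10"},
    {"name": "www",  "ip": "192.168.1.11"},
]

def zone_a_records(hosts, domain="example.com"):
    """Emit one zone-file A record per host. The script, not a human,
    does the typing, so every line is formatted identically."""
    return ["%s.%s.\tIN\tA\t%s" % (h["name"], domain, h["ip"]) for h in hosts]

for line in zone_a_records(hosts):
    print(line)
```

Whatever the language, the win is the same: the configuration file is regenerated from one authoritative source instead of being hand-edited.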
The five tutorial appendices come as an added bonus. You don't need to be a guru of these protocols and languages to be a successful system administrator, but a certain level of knowledge is vital. The appendices focus on the essentials and are superb examples of accurate, concise, well-targeted writing.
Once again, O'Reilly has lived up to its reputation for excellence, and I cannot find any major points to criticise without being unnecessarily churlish. If only every computing text were as packed with usefulness as this book. - Robert Scovell-Lightfoot
Elizabeth Castro.
XML for the World Wide Web.
Berkeley, Calif.: Peachpit Press, 2000.
paper, 270 p., ISBN 0-201-71098-6, US$19.99.
Peachpit Press: http://www.peachpit.com
I was not particularly impressed when this slim guide landed in my lap. How could its 270 pages contain enough information to do justice to XML - Extensible Markup Language - a language for creating other languages? But I was pleasantly surprised at how well the publisher, Peachpit Press, and the author, Elizabeth Castro, have managed such a task. In so few pages, Castro explains XML markup itself, the CSS and XSL styling languages, and the XLink and XPointer specifications for creating rich link structures. The guide is aimed at budding and professional designers, programmers and writers needing to get up to speed with this rapidly developing language and its tough learning curve. Coverage of advanced XML topics is included, providing essential direction for more complex XML solutions, including an overview of the related core technologies.
The book has been published just when the XML Standards are all becoming "recommended standards" as well as being supported more and more by those developers currently releasing products and services for today's market. Elizabeth Castro clearly explains that there is still some way to go with the adoption of XML as a supported standard, especially in the context of the major browsers. Opera is becoming more supportive of the standard and will hopefully be compliant with its next release. Until there is better support, developers will still need to employ third-party applications to interpret XML documents.
Divided into six clear and logical sections, not including the appendices, the book starts out with Part 1, "XML", moves through Part 2, "DTDs", Part 3, "XML Schema and Namespaces", Part 4, "XSLT and XPath", and Part 5, "Cascading Style Sheets", and finally surfaces in Part 6, "XLink and XPointer". The sections are further subdivided into 16 chapters, giving the reader an excellent overview of XML and the ability to delve in here or there to find relevant material quickly. This book presents the key elements of the technology with enough detail to familiarise the reader with this crucial markup language; it is clearly written and incredibly informative. To take full advantage of XML's power, you have to combine it with its core partner technologies: XML Schema, DTDs, XSLT and XPath, CSS, XLink, and XPointer. Elizabeth Castro's explanation of the building blocks of XML allows the reader to create and use XML-based documents and solutions, which in turn are helping XML to become firmly entrenched in today's Internet data management.
For writers producing content for XML documents, this book will demystify schemas and the process of creating them with the appropriate structure and format. Web designers will learn which parts of XML are most helpful to their creative and technical teams and will quickly get started on creating Document Type Definitions. For programmers, the book makes the syntax, rules and structures crystal clear. The guide also covers the Cascading Style Sheets needed to display documents in the next generation of browsers, databases, and other devices. The supporting Web site, http://www.cookwood.com/xml/, contains a lengthy errata list (many entries, though, are cosmetic details rather than major errors), so have your pen handy when you visit. The site also links to the numerous examples contained within the guide.
Once you have learnt XML and are familiar with all its intricacies, you will probably not use this book as a reference guide. On the other hand, during the learning phase it is an excellent resource. The supporting images throughout the book are worth their weight in gold. Highly recommended. - Glenn Dalgarno
Pat Eyler.
Networking Linux: A Practical Guide To TCP/IP.
Indianapolis, Ind.: New Riders, 2001.
paper, 404 p., ISBN 0-735-71031-7, US$39.99.
New Riders: http://www.newriders.com
We all know that the Internet is an excellent communication medium, and it can be a powerfully effective way of transmitting data between computers on WANs and LANs. However, unless you know how to exploit this medium your efforts can fall short of their full potential. The publication of Networking Linux: A Practical Guide To TCP/IP is without doubt a welcome addition to the technical literature. New Riders' "Landmark" series of guide books is fast becoming the network administrator's best friend. This guide is an ideal reference and "how to" manual for the aspiring professional network administrator busy maintaining Wide Area Networks and/or Local Area Networks, both big and small; it is even useful for the inspired home-based networker or student, and there are now many of us calling ourselves administrators. I am reluctantly putting this very "Practical Guide" down to force myself to write this review.
Pat Eyler writes from first-hand experience and has done a tremendous job of making the information in this TCP/IP guide accessible to "inspired network administrators" like myself. In my opinion, what sets this book apart from the rest is the combination of a friendly writing style with the typical layout and format of all New Riders "Landmark" books. It is a welcome break from the dry writing style of many manuals with little or no supporting graphics. In this book, a clear and easy-to-read font and layout are combined with simple and effective graphics to help the reader understand technical information.
The book is well organised, divided into three main parts: "Protocols", "Using the Protocols Effectively", and "Tools for your Toolkit", plus an appendix which is truly a gem. The introductory first chapter gives an overview of the history of TCP/IP stack development for Linux, from the first ever IP stack, "NET-1", written by Ross Biro, to the current "NET-3" stack, the result of a group effort that makes the Linux distro "incredibly solid, fast and easy!". If you are just starting to deal with administering a network, the author invites you to begin here. More experienced individuals can dive straight into the Requests For Comments, such as RFC 1149, one of the few less-than-serious RFCs published.
Part One, "Protocols", provides a more than thorough look at the link layers, the structure and usage of packets, and how they interoperate to make a network communicate. Part Two, "Using the Protocols Effectively", is an excellent hands-on guide to troubleshooting network problems, while Chapter 6, "A Problem-Solving Pattern", equips the administrator with an effective error-checking routine that gets quickly to the root of network problems. There are many case studies providing insights into real-world solutions, and a chapter devoted to network baselining. Part Three, "Tools for your Toolkit", is a collection of necessary tools for every network administrator and covers installing and using tools for troubleshooting, monitoring and securing a network. The tools are covered by the Open Publication License (OPL). There is an online support site for this section at http://www.networkinglinuxbook.com
The strengths and weaknesses of each of the TCP/IP protocols are covered in great detail, together with focused advice on how each protocol behaves when communicating across WANs or LANs. This is an essential reference if you are inheriting or creating a network, or administering the integration of several or additional networks. With the rapid expansion and development of new Web technologies it is now more important than ever to streamline your network's performance to provide an acceptable level of service to the end user. Networking Linux: A Practical Guide To TCP/IP achieves this with no noticeable learning curve; the information just seeps out of the book and onto your network! The administrator's new friend! - Glenn Dalgarno
Stephen Northcutt and Judy Novak.
Network Intrusion Detection: An Analyst's Handbook.
Indianapolis, Ind.: New Riders, 2000.
paper, 480 p., ISBN 0-735-71008-2, US$45.00.
New Riders: http://www.newriders.com
When it comes to security most companies invest in a firewall with the assumption that this will adequately protect their internal network. However, most of the time, this turns out to be an overly optimistic attitude. An intrusion detection system gives early warnings that someone is probing a network by detecting unusual behaviour or malformed packets.
This book is an outstanding introduction to intrusion detection and shows how vulnerable TCP/IP is to abuse. It is a fully updated edition and, if you have owned the previous one, you will notice that many sections have been extensively rewritten and updated.
The first several chapters do an excellent job of covering the networking protocols and their underlying technologies and functionality: DNS, ICMP, fragmentation, IP concepts and tcpdump are all explained in detail. The examples are taken from the output of tcpdump itself, so the unfamiliar reader might need some time to get used to reading all the information.
Further on, the book discusses various IDS systems and the types of attacks that can be detected; it concludes with the business case for intrusion detection. Here it would have been appropriate to include information about "snort", a free IDS; this is probably the only criticism I can make.
Stephen Northcutt is an excellent author who injects his personal life into his work, making it a very interesting and readable technical book. I would recommend Network Intrusion Detection to anyone interested in networks or security. It is a first-rate resource, well-written and full of useful information. - Richard Gale
Robert L. Park.
Voodoo Science: The Road from Foolishness to Fraud.
Oxford: Oxford University Press, 2000.
cloth, 230 p., ISBN 0-198-50745-3, GB£18.99.
Oxford University Press: http://www.oup.com
Pick up a copy of this book and one of the first things you'll notice on the cover is a snippet by Richard Dawkins, Professor for the Public Understanding of Science at the University of Oxford, which states: "I finished this brilliant book within a day and then felt such withdrawal symptoms I went right back to the beginning and started again." It might at first seem a bit of an exaggeration; however, this is nearly what happened to me after I finished Voodoo Science. The 'nearly' is only because I simply did not have the time to re-read it straight away.
The book's main theme is the interplay between scientific discoveries and the traditional peer-review process (or the lack thereof). Science can reward immensely those who deal with it. A major breakthrough is the most desired prize for researchers and scholars in fields such as physics, chemistry, medicine and technology; in a sense, then, it is understandable that the lure of scientific success and worldwide recognition can have the power to undermine an honest and objective discussion of the specifics involved. It is precisely here that peer review comes into play. As Park writes: "The success and credibility of science are anchored in the willingness of scientists to obey two rules:
- Expose new ideas and results to independent testing and replication by other scientists.
- Abandon or modify accepted facts or theories in the light of more complete or reliable experimental evidence."
Unfortunately, when these rules are broken there is a real danger that an ingenuous error will evolve from self-delusion into fraud. Voodoo Science presents a series of such cases, where the original discovery was not subjected to testing by disinterested parties in order to assess its validity. Alleged breakthroughs such as cold fusion, the Roswell incident of 1947, homeopathic medicines, and the multitude of contraptions that claimed to produce more energy than they consumed are all ideal scenarios for a focused and objective analysis by the author. Park examines the cases with lucidity and competence, revealing how extreme self-conviction and deliberate deceit, combined with ignorance on the part of the public and the non-scientific community, can foster 'pathological science', 'pseudoscience' and 'junk science', all of which he groups under the term 'voodoo science'.
The solution to the problem? Raising the level of scientific literacy by disseminating the notion that we all live in an orderly universe governed by natural laws that cannot be changed or sidestepped at will by humans. If the general public had the opportunity to assess scientifically what is presented to them, it would be more difficult to promote sensationalistic discoveries, or those cooked up merely to increase the financial benefits of whoever is associated with them.
The reality, however, is that "most of us wind up with beliefs that closely resemble those of our parents and community. Society, in fact, often holds it to be a virtue to adhere to certain beliefs in spite of evidence to the contrary." Voodoo science is a product of our own attitude towards what is new and difficult to grasp, towards something which we might not comprehend unless accurate, unbiased information is passed onto us.
In this respect, what the book highlights is the direct and negative impact that modern media coverage has had (and still has) on our perception of revolutionary inventions or scientific advancements. "The distinction between untested (or even untestable) speculation and genuine scientific progress is often lost in media coverage. The confusion is reinforced by the scientists themselves. They are eager to tell people what it's like on the frontier. They want to talk about neutrino oscillations, Higgs bosons, cosmic inflation, and quantum weirdness - the things that excite them. And of course they should - this is part of the human adventure - but in doing so they cannot resist pandering to the public's appetite for the 'spooky' part of science."
Park brings us again and again the image of "talking heads", so-called experts, usually identified only by a brief caption, who appear in front of the camera making short, inconclusive comments on the innovation being presented: one cannot deny that television promotes such spectacles because they are good for ratings and for no other reason.
Voodoo Science is definitely one of the most enjoyable books I've read in ages. It is a real treat; its language is clear and unpretentious, and its content should be mandatory reading for the general public, especially the young. At a minimum, it should be used as an antidote against misinformation overload. - Paolo G. Cordone
Ellen Siever, Stephen Spainhour, Jessica P. Hekman, and Stephen Figgins.
Linux in a Nutshell.
Sebastopol, Calif.: O'Reilly, 2000.
paper, 812 p. with CD-ROM, ISBN 0-596-00025-1, US$34.95.
Had you walked into a PC supplier five years ago, I would guarantee that there would have been only one operating system on the shelves. Today you will see a number of different versions of the Linux operating system alongside the latest incarnation of that same offering. Linux has grown up from the realm of the "geek" downloading the kernel from the Internet to become a teenager with attitude playing alongside the big boys. Even IBM has taken it under its wing, and you can run Linux on an S/390 mainframe if you so desire.
People are now buying Linux, and others are getting it for free from a variety of sources. It does not, however, come installed on a PC direct from a supplier. If you want to run it, you are going to have to configure your PC, install and configure Linux, and manage more things in the background than under Windows. This all means that, unless you have been working with UNIX for a while, you will need training, or alternatively a good book as a guide. Linux in a Nutshell would be high on my list of choices for that book.
It starts out with two useful chapters that list the common Linux commands and the administrative commands in logically organised sections. This is particularly useful, as Chapter 3 presents the man page information in alphabetical order. In all honesty, I would have preferred to see that chapter broken down into a number of smaller chapters covering the commands in related areas: printing commands, networking commands, file system management, and so on. Most of its content is available as online manual pages once you have your system up and running and know how to run the man command; what is harder to discover is which command you actually need to carry out a particular job.
The next chapter looks at configuring the boot loader to allow Linux to be booted. I have always found LILO to be particularly irritating to configure, and this chapter helped me quite considerably when trying to get Linux to live alongside Windows NT. From there the book moves onto package managers and gives a basic overview of the common managers available within Linux. This will be more than enough to allow a new user to add new packages into their installation.
The next chapters look at shells, and particularly bash, tcsh and csh. It is impossible to do justice to the features available in any of these shells, particularly the scripting and programming capabilities offered by each of them. There is however enough information there to get started with, and in conjunction with the later chapters on sed and gawk to allow a new user with a degree of exposure to programming to develop reasonably powerful scripts.
The remaining chapters cover some of the editors available, namely emacs and vi; the revision control utilities RCS and CVS; and finally the Gnome, KDE and fvwm2 window managers. The book is meant to be a quick reference, but I find it quite comprehensive. Indeed, if this is a nutshell, then the nut would have to feed a family of four! - Peter Scott
Larry Ullman.
PHP for the World Wide Web.
Berkeley, Calif.: Peachpit Press, 2001.
paper, 278 p., ISBN 0-201-72787-0, US$19.99.
Peachpit Press: http://www.peachpit.com
PHP, or "Personal Home Page" as named by its creator Rasmus Lerdorf in 1994, was developed to record and log the activity of visitors to his online résumé. Today it stands for "PHP: Hypertext Preprocessor" and is a server-side, cross-platform, HTML-embedded scripting language. Server-side means that all of PHP's actions take place on the server; cross-platform means that PHP can be used on almost any platform, from OS/2 to Windows NT, from UNIX to the Mac OS; HTML-embedded means that PHP code can be included within HTML documents; and it is a scripting language because PHP is designed to act only in response to an event, such as a page request.
Peachpit Press have a reputation for producing first-rate computing books and they have again surpassed themselves with this guide. Each page is accessible, well laid out and full of detail, with the usual tips and tricks. Larry Ullman brings the reader easily through the book's 14 chapters, covering the most important fundamentals: numbers, strings, control structures, arrays and regular expressions; creating functions; using files and directories effectively; database connectivity, cookies and Web applications; and the required reading of all programmers, budding or not: script debugging.
The Visual QuickStart Guide to PHP for the World Wide Web conveys a solid understanding of the PHP language fundamentals, though it is not an in-depth and comprehensive reference. It provides the knowledge to start building dynamic Web sites and Web-based applications today. I have created my own PHP applications as described and shown by the code examples from the book and have achieved very satisfactory results. The appendix is not massive but contains "how-to's" on installing PHP onto either a UNIX or Windows 2000 server. In this respect it would have been nice to have a bit more information on the options available during the installation of the PHP module. There are links, with descriptions, to the main PHP resources available on the World Wide Web and, although not completely exhaustive, they are more than adequate for the reader's first steps.
PHP and HTML are covered together within the various chapters, so even if you don't know much about creating forms in HTML, you will be able to follow the examples in the book. You will also need a server with PHP3/4 installed, or at least an ISP account that will handle PHP3/4 requests, so as to be able to follow, write and run the examples, even if you have downloaded them from the support site. The reader will not end up out of their depth as new methods, quirks and concepts are introduced; instead the author gives plenty of encouragement to try the examples and experiment. The PHP examples are well commented, a practice Ullman strongly recommends: you may know what an application does and why you wrote it one day, but a few months down the line you may need your memory jogged! Although the examples are not particularly cutting-edge or complex, they will provide you with a better understanding of the PHP language, helped by the effective supporting images and graphics which back up the text.
By rewriting the examples you become more receptive and therefore able to pick up the quirks of the language, retaining them longer in your head. Starting out with the infamous "Hello World!" example this guide will help you develop handy modules that can be incorporated into your existing Web site(s), so as to create more dynamic pages for your audience.
If you're a Web developer or programmer who just wants to tinker with PHP but does not yet know enough about it, this book is aimed at you and is very reasonably priced. If you're a beginner who hasn't done much with PHP but would like to learn, the Visual QuickStart Guide to PHP would also be an excellent starting point. However, if you're already accustomed to PHP and only want to know about the new features of PHP4, there is little to whet your appetite, and you would be better off spending your time researching on the Internet. - Glenn Dalgarno
Copyright ©2001, First Monday