Posts Tagged ‘Charles Edge’

MacSysAdmin 2012 Slides and Videos are Live!

Thursday, September 20th, 2012

318 Inc. CTO Charles Edge and Solutions Architect alumnus Zack Smith were back at the MacSysAdmin Conference in Sweden this year, and the slides and videos are now available! All of the 2012 presentations can be found here, and past years are at the bottom of this page.

Google Acquires Zagat

Thursday, September 8th, 2011

Yelp. They came onto the online scene fast, and have since become the way many of us find restaurants when in foreign lands (or even our own back yard). They even ended up doing so well that Google tried to acquire them for half a billion dollars in 2009. But when you’re hot, you’re hot, and they decided to continue on their own path.

Zagat, a classic company founded way back in the 1970s, is kind of the Gold Standard of restaurant reviews. I remember using Zagat to find restaurants in Rome back in the early 1990s (couldn’t quite afford to eat at a place with a 30 rating back then). And their review guides are great. In 2008 they put themselves up on the auction block and summarily took themselves right back down. At that point in time, Zagat would have cost a cool $200 million – a steal compared to upstart Yelp. But while Zagat is a company with a lot of content, it is not a company with a lot of content freely available on the web – which seems to be the name of the game these days.

While Google doesn’t own Yelp, they still want user-generated reviews. Google announced on their blog today that they’re buying Zagat. This move isn’t just about user-generated reviews though, it’s about content. Zagat has 30+ years worth of content, much of which dates back to the manual form of user-generated reviews.

If you look at Google’s most recent acquisitions, many involve coupons, social media, price comparisons, gaming, travel and, who can forget, that big daddy of content, YouTube. All Google needs to do is buy Wikipedia and they’d own a huge chunk of the content out there, or at least they’d own enough to point you to their chunk. The moves only make sense. Try running a define search (e.g. define: Google). Notice that rather than all of the links being hits on other sites, the first is now an actual definition. Clicking More>> brings up the Google Dictionary, not Wikipedia. And in some cases, that dictionary entry is basically the only thing (YMMV).

Google changed the game when it comes to how people find things. They’re in the process of changing that game again. How can you capitalize on these changes? This is going to be different for everyone, but your 318 Professional Services Manager will be happy to discuss strategies for social media, your online presence and the new king of the online world: content. If you do not yet have a Professional Services Manager, please contact 318 at 310-581-9500 or sales@318.com for more information!

CIO: An Interview with Charles Edge on iPad 2

Friday, March 4th, 2011

Charles Edge, the Director of Technology for 318, was interviewed recently by CIO magazine, shortly after the announcement of the iPad 2. The interview covers the enterprise viability of the iPad 2, along with a number of other topics around iOS in the enterprise.

See the full article here:
http://www.cio.com/article/672117/Do_iPad_2_iOS_4.3_Make_Enough_Gains_for_Enterprise_?source=rss_news

eWeek Article Featuring Charles Edge

Wednesday, January 5th, 2011

318 is in the news yet again, this time as the central figure in an eWeek article entitled “How Influx of iPhones, iPads Impacting Enterprises”. The article is available at http://www.eweek.com/c/a/Mobile-and-Wireless/How-Influx-of-iPhone-iPads-Impacting-Enterprises-582284 and focuses on, as the title references, what enterprises are to do with the infiltration of the iPad and iPhone. While the article is specifically geared towards Apple-based devices, the ideas can be applied to other platforms as well. In the article, Chris Preimesberger interviews 318’s Director of Technology, Charles Edge, and provides answers to a number of specific questions that enterprises come to the table with when they approach the Apple platform.

If you are adopting Apple into your enterprise, you may have even more questions that need answering. If so, please feel free to contact your 318 Professional Services Manager or sales@318.com if you do not yet have one.

Book On Enterprise iOS Integration Available

Monday, December 20th, 2010

The 6th book from 318’s staff is now available: Enterprise iPhone and iPad Administrator’s Guide. In this title, Charles Edge, the Director of Technology at 318, looks at lessons learned in our numerous iOS integration projects, from procurement to deployment to patch management. The publisher, Apress, describes who the book is intended for as follows:

This book is intended for IT staff members that will be charged with planning an iPhone and iPad implementation or pilot program, as well as those that will be charged with ultimately deploying and provisioning the devices and delivering support to iPhone and iPad users. Readers should have an existing background in IT management, systems administration, and end user support working in a medium to large business or enterprise environment.

If you are considering doing a large scale integration or remediation project for iOS-based devices in your environment then contact your 318 Professional Services Manager or sales@318.com for more information on how 318 can assist you in your endeavors.

PresSTORE Article on Xsanity

Tuesday, November 16th, 2010

We have posted a short article on the availability of PresSTORE 4.1 on Xsanity at http://www.xsanity.com/article.php/20101116105720183. Enjoy!

Visit our booth at Macworld 2010

Thursday, February 11th, 2010

Come visit our booth at Macworld 2010 on the expo floor. We are located in Booth 566C and have a bunch of free schwag to give out.

We also have a number of sessions this year:

Hands-on Snow Leopard Server: Collaboration Services with Charles Edge
2/10 – 1:00PM to 3:00PM

Push: The Next Generation of Collaboration is Snow Leopard Server with Charles Edge
2/11 – 4:30PM to 6:00PM

Advanced Integration with Final Cut Server with Beau Hunter
2/12 – 3:30PM to 5:00PM

iPhone Mass Deployment with Zack Smith
2/13 – 2:30PM to 4:00PM

We hope to see you there!

318 & MacWorld 2010

Thursday, September 24th, 2009

318 is proud to announce that we will have 3 speakers doing a total of 4 sessions at the upcoming MacWorld Conference & Expo in San Francisco in February. Speakers will be Beau Hunter, Zack Smith and Charles Edge.

We will also be announcing some events as the conference gets closer. If you are planning to attend then you can sign up here. We hope to see you there!

Xsanity article on Configuring Network Settings using the Command Line

Tuesday, February 10th, 2009

We have posted another article to Xsanity on “Setting up the Network Stack from the Command Line”. An excerpt from the article is as follows:

Interconnectivity with Xsan is usually a pretty straightforward beast. Make sure you can communicate in an unfettered manner on a house network, on a metadata network and on a fibre channel network and you’re pretty much good to go. One thing that seems to confuse a lot of people when they’re first starting out is how to configure the two Ethernet interfaces. We’re going to go ahead and do two things at once: explain how to configure the interface and show how to automate said configuration from the command line so you can quickly deploy and then subsequently troubleshoot issues that you encounter from the perspective of the Ethernet networks.
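To give a flavor of the commands the article walks through, here is a minimal sketch using the networksetup tool (the service name “Ethernet 2” and the addresses are placeholders, not values from the article):

networksetup -listallnetworkservices
sudo networksetup -setmanual "Ethernet 2" 192.168.1.50 255.255.255.0 192.168.1.1
networksetup -getinfo "Ethernet 2"

The first command shows the service names you can act on, the second assigns a static address to one of them, and the third confirms the result.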

View the full article here.

Xsanity Article on Managing Fibre Channel from the Command Line

Friday, January 23rd, 2009

We have posted another article on Xsanity. This one is on managing Fibre Channel settings from the command line. The following is an excerpt from the article:

Once upon a time there was Fibre Channel Utility. Then there was a System Preference pane. But the command line utility, fibreconfig, is the quickest, most verbose way of obtaining information and setting various parameters for your Apple branded cards.

To get started with fibreconfig, it’s easiest to simply ask the fibreconfig binary to display all of the information available on the fibre channel environment. This can be done by using the -l option as follows:
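That command, per the option named in the excerpt, is simply:

sudo fibreconfig -l

which, per the excerpt, displays all of the information available about the Fibre Channel environment on that host.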

View the full article here.

Xsanity Article on Labeling LUNs from the Command Line

Monday, January 19th, 2009

We have published another article on Xsanity. This one is on using removable media with Xsan. More importantly, this article shows how to label a LUN using the command line tool cvlabel. An excerpt is as follows:

Sometimes you just need a small SAN for testing… Times when you don’t have fibre channel and you don’t need to provide access to multiple clients, like maybe if you’re writing an article for Xsanity on an airplane. Now this volume is not going to be supported (or supportable) by anyone and nor should it be (so don’t use it for real data), but you can use USB and FireWire drives for a small test Xsan…
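As a rough sketch of the labeling workflow the article covers (exact flags can vary by Xsan/StorNext version, and the file path here is just an example):

sudo cvlabel -l
sudo cvlabel -c > /tmp/label_list
sudo cvlabel /tmp/label_list

The first command lists the LUNs the host can see and their current labels, the second writes a label template you can edit with the names you want, and the third applies the edited labels. See the article for the full walkthrough.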

View the full article here.

Article on Xsanity – Linux + Xsan

Tuesday, January 13th, 2009

After a long silence on Xsanity, 318 has published the first of a number of articles for the site. The article focuses on how to install and configure StorNext clients running Red Hat Enterprise Linux (RHEL) to connect to an Xsan. It is available here.
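At a very high level, and assuming the StorNext client packages are already installed, the client-side setup generally looks something like this (the controller address, volume name XsanVol and paths are illustrative only):

echo "10.0.0.2" > /usr/cvfs/config/fsnameservers
service cvfs start
mkdir -p /mnt/XsanVol
mount -t cvfs XsanVol /mnt/XsanVol

The fsnameservers file points the Linux client at the Xsan metadata controller over the metadata network; the actual data path then runs over Fibre Channel. The article covers the details and the gotchas.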

A Brief History of Cryptography

Tuesday, October 23rd, 2007

Cryptography is derived from the Greek words kryptos, meaning “hidden”, and graphein, meaning “to write”. Throughout history, cryptography has meant the process of concealing the contents of a message from all except those who know the key. Cryptography is used to protect e-mail messages, credit card information, and corporate data. It has been used for centuries to hide messages sent through channels where they might be intercepted – today, most notably the Internet.

But encrypting email messages as they traverse the Internet is not the only reason to understand or use various cryptographic methods. Every time you check your email, your password is being sent over the wire. Many ISPs and corporate environments use no encryption on their mail servers, so the passwords used to check mail are submitted to the network in clear text. When a password travels over the wire in clear text it can easily be intercepted. This is especially dangerous when you are on the road, at hotels, on wireless hotspots, or at an Internet café. However, it is often just as simple to obtain another user’s password for email, payroll systems and file servers while at work and on the same network. Applications such as Wireshark (formerly Ethereal) and many others have existed for a long time and are now fairly advanced, allowing an attacker to replay a password, or a stream of packets that resemble credentials, to a server in order to gain entry.

To aid in protecting communications between computers, there are a wide variety of cryptographic implementations in use. They are typically provided for one of two reasons: to protect data on the computer or to protect data as it is being transferred. Most cryptographic techniques rely heavily on the exchange of cryptographic keys.

Symmetric-key cryptography refers to encryption methods in which both senders and receivers of data share the same key, and data is encrypted and decrypted with algorithms based on those keys. The modern study of symmetric-key ciphers revolves around block ciphers and stream ciphers and how these ciphers are applied. Block ciphers take a block of plaintext and a key, then output a block of ciphertext of the same size. DES and AES are block ciphers. AES, also called Rijndael, is a designated cryptographic standard of the US government and uses a key size of 128, 192 or 256 bits. DES is no longer an approved method of encryption, but triple-DES, its variant, remains popular. Triple-DES uses three 56-bit DES keys and is used across a wide range of applications, from ATM encryption to e-mail privacy and secure remote access. Many other block ciphers have been designed and released, with considerable variation in quality.
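For a concrete sense of what this looks like in practice, here is a minimal OpenSSL example (filenames are placeholders) that encrypts and then decrypts a file with AES-256 in CBC mode, deriving the key from a passphrase:

openssl enc -aes-256-cbc -salt -in plans.txt -out plans.txt.enc
openssl enc -d -aes-256-cbc -in plans.txt.enc -out plans-decrypted.txt

OpenSSL prompts for the passphrase in both directions; without it, the ciphertext is useless.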

Stream ciphers create an arbitrarily long stream of key material, which is combined with a plaintext bit by bit or character by character, somewhat like the one-time pad encryption technique. In a stream cipher, the output stream is based on an internal state, which changes as the cipher operates. That state’s change is controlled by the key, and, in some stream ciphers, by the plaintext stream as well. RC4 is an example of a well-known stream cipher.

Cryptographic hash functions do not use keys but take data and output a short, fixed length hash in a one-way function. For good hashing algorithms, collisions (two plaintexts which produce the same hash) are extremely difficult to find, although they do happen.
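For example, OpenSSL can compute a digest of a file in a single command (the filename is a placeholder):

openssl dgst -sha1 plans.txt
openssl dgst -md5 plans.txt

Changing even a single byte of the file produces a completely different hash, which is what makes hashes useful for verifying integrity.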

Symmetric-key cryptosystems typically use the same key for encryption and decryption. A disadvantage of symmetric ciphers is that a complicated key management system is necessary to use them securely. Each distinct pair of communicating parties must share a different key, so the number of keys required grows quadratically with the number of network members: n participants need n(n-1)/2 pairwise keys. This requires very complex key management schemes in large networks. It is also difficult to exchange a secret key between two communicating parties when a secure channel doesn’t already exist between them.

Whitfield Diffie and Martin Hellman are considered the inventors of public-key cryptography. They proposed the notion of public-key (also called asymmetric key) cryptography in which two different but mathematically related keys are used: a public key and a private key. A public key system is constructed so that calculation of the private key is computationally infeasible from knowledge of the public key, even though they are necessarily related. Instead, both keys are generated secretly, as an interrelated pair.

In public-key cryptosystems, the public key may be freely distributed, while its paired private key must remain secret. The public key is typically used for encryption, while the private or secret key is used for decryption. Diffie and Hellman showed that public-key cryptography was possible by presenting the Diffie-Hellman key exchange protocol. Ronald Rivest, Adi Shamir, and Len Adleman invented RSA, another public-key system. Later, it became publicly known that asymmetric cryptography had been invented by James H. Ellis at GCHQ, a British intelligence organization, and that both the Diffie-Hellman and RSA algorithms had been previously developed there. Diffie-Hellman and RSA, in addition to being the first public examples of high-quality public-key cryptosystems, are among the most widely used.
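As an illustration, generating an RSA key pair with OpenSSL takes two commands (2048-bit key; filenames are placeholders):

openssl genrsa -out private.pem 2048
openssl rsa -in private.pem -pubout -out public.pem

The private.pem file must be kept secret, while public.pem can be distributed freely.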

In addition to encryption, public-key cryptography can be used to implement digital signature schemes. A digital signature is somewhat like an ordinary signature: it is easy for the user to produce, but difficult for anyone else to forge. Digital signatures can also be permanently tied to the content of the message being signed; they cannot be ‘moved’ from one document to another, as any attempt will be detectable. In digital signature schemes there are two algorithms: one for signing, in which a secret key is used to process the message (or a hash of the message, or both), and one for verification, in which the matching public key is used with the message to check the validity of the signature. RSA and DSA are two of the most popular digital signature schemes. Digital signatures are central to the operation of public key infrastructures and to many network security schemes (SSL/TLS, many VPNs, etc.). Digital signatures give users the ability to verify the integrity of a message, thus allowing for non-repudiation of the communication.
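A minimal signing example with OpenSSL, reusing the key pair generated above (filenames are placeholders): the sender signs a hash of the message with the private key, and the recipient verifies it with the public key.

openssl dgst -sha1 -sign private.pem -out message.sig message.txt
openssl dgst -sha1 -verify public.pem -signature message.sig message.txt

The verify command reports “Verified OK” only if the message is unaltered and the signature was produced by the matching private key.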

Public-key algorithms are most often based on the computational complexity of “hard” problems, often from number theory. The hardness of RSA is related to the integer factorization problem, while Diffie-Hellman and DSA are related to the discrete logarithm problem. More recently, elliptic curve cryptography has been developed, in which security is based on number theoretic problems involving elliptic curves. Because of the complexity of the underlying problems, most public-key algorithms involve operations such as modular multiplication and exponentiation, which are much more computationally expensive than the techniques used in most block ciphers, especially at typical key sizes. As a result, public-key cryptosystems are commonly “hybrid” systems, in which a fast symmetric-key encryption algorithm is used for the message itself, while the relevant symmetric key is sent with the message, encrypted using a public-key algorithm. Hybrid signature schemes are used similarly: a cryptographic hash of the message is computed, and only the resulting hash is digitally signed.
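A rough sketch of that hybrid pattern with OpenSSL (filenames are placeholders): generate a random symmetric key, encrypt the message with it, then encrypt the small key itself with the recipient’s RSA public key.

openssl rand -out session.key 32
openssl enc -aes-256-cbc -in plans.txt -out plans.enc -pass file:session.key
openssl rsautl -encrypt -pubin -inkey public.pem -in session.key -out session.key.enc

Only the recipient, who holds the matching private key, can recover session.key and therefore the message.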

OpenSSL is one of the main tools used on Linux and Mac OS X to access the various encryption mechanisms supported by the operating systems. OpenSSL supports Diffie-Hellman and various versions of RSA, MD5, AES, Base64, SHA, DES, CAST and RC4. OpenSSL allows you to create ciphers, decrypt information and set the various parameters required to encrypt and decrypt data.

THIS ARTICLE IS A REPRINT FROM: Foundations of Mac OS X Security, from Apress, written by Charles Edge, William Barker and Zack Smith of 318.

SANS Mac OS X Fundamentals Now Available

Tuesday, August 21st, 2007

The SANS Institute recently released a course by Charles Edge on Mac OS X Security Fundamentals. The course is described in the following manner:

“SANS is the leader in Information Security. This course on securing Mac OS X is the fastest and most comprehensive way to get up to speed on applying the principles of the information security industry to the Mac. Written and taught by one of the security veterans of the Mac community, this course covers how real-world security concepts are applied to the Mac, with real-world examples from the Mac community. The course offers a balanced mixture of technical issues, making it appealing to attendees needing to understand how to effectively secure a Mac.

We begin by reviewing existing Mac exploits and then move on to covering the basic concepts and challenges of securing a Mac. Next, we review the standard security measures that should always be employed and the usability implications of each. We cover forensics, intrusion detection, firewalls, web browsers, mail programs, network infrastructure, preferences, system policies, command line tools, encryption, hardware and OS X Server. Throughout the course you will find thorough coverage of defense in depth on the Mac platform.

If you’re a newcomer to the field of information security but a long-time user of the Mac, or a newcomer to the Mac but a long-time information security expert, then this is the course for you. You will develop skills that will help you bridge the gap between the Mac administrators and the security administrators in most organizations. You will also learn the ins and outs of keeping your data private.

This is an ideal course for anyone charged with securing Mac systems. From securing a desktop to the high availability options available on the platform, this course is going to be a whirlwind overview of the Mac that will leave you ready to move to the next level!”

For more information on the course, see the following link:

https://www2.sans.org/staysharp/description.php?tid=1492

318 Kerio Migration Article

Saturday, June 16th, 2007

There is an article about 318 on Apple.com focusing on a project we did integrating Kerio to replace Microsoft Exchange. The migration gave our client the ability to centralize all of their server assets into an Open Directory environment while still using MAPI to provide groupware components to their user base, sync handheld devices with their calendar, mail and contacts and, of course, use the standard Exchange-style features such as mail. Good stuff:

http://consultants.apple.com/at_a_glance/318inc/

Sherlock – The Forgotten Mac Program

Monday, February 26th, 2007

Just last week, I was in the midst of celebrating my birthday. It was more or less a camping trip and, like any true geek, I brought all my techie goodies with me just in case.

I had my laptop, networking/FireWire cables, digital camera, AC inverter (so that my car could charge all my devices), and rechargeable batteries. You name it, I brought it, and it all came in quite handy on the trip too. When I filled the CF card in my new Nikon D70 digital camera, my laptop was there to download the pictures and burn a CD backup just in case. When my camera batteries got low, I used my AC inverter, powered by the car, to charge my batteries. And when I needed to check my email, my laptop connected to the net through my cell phone and I was able to stay connected to the outside world. All in all, I was prepared for anything, or so I thought.

As we were driving through Death Valley, miles away from any cell phone reception and further from any signs of civilization and the technological world, you can imagine how surprised I was when we came upon a broken down car. I slowed and signaled to the driver, who was waving at me to pull over and help him. When I asked what the matter was, I received the reply, “Non parlo inglese.”

After a few minutes of carefully planned gesticulation I learned that my two Italian friends, Mateo #1 and Mateo #2, were on their way to San Francisco when they hit a rock, which smashed their oil pan and stranded them. There was an enormous language gap and most of our communications consisted of one-word sentences such as “hungry?” and “hotel?” with the occasional compound three-word sentence as in “what your email address?”

So you may be asking yourself what this little story has to do with technology. Here it is: SHERLOCK by APPLE.

Fast forward 3 weeks and I’m home, about to go see a movie. Naturally, I opened up Sherlock to check the movie time and the translation button caught my eye. Translation button? I opened it up and realized that every Mac has a built-in language translator ready to go, with the 11th option from the top being “English to Italian.”

Here I am, a techno-savvy computer dude, and this most basic feature eluded me for years. If I had known about it 3 weeks ago it would have made our rescue mission just a bit easier and allowed us to get to know our Italian friends a little better. On top of doing language translations, Sherlock can look up movie times with QuickTime previews, stock quotes, picture searches, yellow pages listings and a lot more.

Sherlock, it’s back in my dock.

serveradmin in OS X

Monday, November 20th, 2006

Mac OS X Server is a strange beast. It can make you think it’s the greatest thing in the world, in that you can do all kinds of complicated stuff quickly through a nice GUI. It can also dismay those of us who know where the Unix specifics live in the OS and would prefer to configure things there. So, where are all those settings that override so many of the default Unix configuration files? serveradmin is a command that gives access to much of what you see in Server Admin – and much of what you don’t.

serveradmin use starts out with viewing data on a specific service. For example, type sudo serveradmin fullstatus vpn to see a full status on the settings and use of the VPN service, or issue a sudo serveradmin settings ipfilter command to see the settings applied to the firewall service. To see all of the services you can configure and view, type sudo serveradmin list. Then look at doing a serveradmin start afp followed by a serveradmin stop afp. Suddenly you are stopping and starting services on a server from the command line, meaning you can actually issue these over an SSH session rather than having to use ARD to connect. This can become invaluable when a bad firewall rule locks you out of the Server Admin tool. Just issue a serveradmin stop ipfilter and you’re right back in!

You can also change settings that aren’t available in the GUI. For example, look at VPN. Let’s customize where we put our logs. First, type in sudo serveradmin settings vpn. Now, look for the following entry:
vpn:Servers:com.apple.ppp.pptp:PPP:Logfile = “/var/log/ppp/vpnd.log”

To change this setting, type in:
sudo serveradmin settings vpn:Servers:com.apple.ppp.pptp:PPP:Logfile = “/var/log/ppp/pptpvpnd.log”

Now the PPTP logs will be stored in a separate location from the logs for the rest of the VPN service. This couldn’t have been done by editing a configuration file, only by using the serveradmin command. Nifty!

Now let’s look at NAT. NAT is cool, but there are just two buttons: Start and Stop. So how would we require a proxy for Internet traffic? How about this:
sudo serveradmin settings nat:proxy_only = yes

Or we could log denied access attempts using:
sudo serveradmin settings nat:log_denied = yes

These options aren’t available from the GUI at all. But what really happens when we’re using these commands? Well, typically a plist file is being updated. Any time you see a yes or no value, you are looking at a boolean variable in a plist file. That log_denied variable, for example, is stored in /private/etc/nat/natd.plist as a boolean key named log_denied.

Fun stuff! In my book I actually go into a little more detail about forwarding specific ports to other IP addresses using the NAT service as well. That too happens in a plist.
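One more trick worth knowing, since serveradmin reads and writes plain text: you can dump a service’s settings to a file and feed an edited copy back in, which makes for a handy backup before experimenting (the path below is just an example):

sudo serveradmin settings afp > /tmp/afp.settings
sudo serveradmin settings < /tmp/afp.settings

The first command saves every AFP setting; the second reapplies whatever is in the file.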

Practical ILM for Small Businesses: Information Life Cycle Management (ILM)

Thursday, October 26th, 2006

The amount of data used by small businesses is on target to rise 30% to 35% in 2006. Sarbanes-Oxley, HIPAA and SEC Rule 17a-4 have introduced new regulations on the length of time data must be kept and in what format. Not only must data be kept, it must be backed up and secured. These factors have the cost of data storage for the small business rising sharply.

Corporations valued at more than 75 million dollars are generating 1.6 billion gigabytes of data per year. Small and medium sized companies can reap the benefits of developments made for larger corporations, and data classification methods are one of these.

Information Lifecycle Management (ILM) is a process for maximizing information availability and data protection while minimizing cost. It is a strategy for aligning your IT infrastructure with the needs of your business based on the value of data. Administrators must analyze the trade-offs between cost and availability of data in tiers by differentiating production or transactional data from reference or fixed content data.

ILM includes the policies, practices, services and tools used to align business practices with the most appropriate and cost-effective data structures. Once data has been classified into tiers then storage methods can be chosen that are in line with the business needs of each organization. The policies to govern these practices need to be clearly documented in order to keep everyone working towards the same goals.

Storage Classification

Online storage is highly available, with fast and redundant drives. The Xserve RAID and Xsan are considered online storage, which is best used for production data as it is dynamic in nature. This can include current projects and financial data. This data must be backed up often and be rapidly restorable in the event of a loss. It is not uncommon to use one Xserve RAID to back up another for immediate restoration of files, with a tape library maintaining offsite backups.

Offline storage is used for data retained for long periods of time and rarely accessed. Data often found on offline media includes old projects and archived email. Media used for offline storage is often the same as media used for backup, such as tape and optical media. When referring to offline storage we mean archives, not backups: archives are typically static, whereas backups change with each backup run. Offline storage still needs to be redundant or backed up, but the schedules for backup are often more relaxed than those for other classifications of storage. In a small or medium-sized company, offline media is often backed up, or duplicated, to the same type of media it is housed on. There may be two copies of a tape (one onsite and one offsite) or two copies of the DVDs that the data has been burned onto, with each copy stored in a different physical location.

Near-line storage bridges the gap between online and offline storage by providing faster data access than archival storage at a lower cost than primary storage. FireWire drives are often considered near-line storage because they are slower and usually not redundant. Near-line storage can hold recent projects, old financial data, office forms that are rarely updated and backups of online storage kept readily available for rapid recovery. Backups of near-line storage will probably go to tape.

Data Classification

Mission-critical data is typically stored in online storage. This is the day-to-day production data that drives information-based businesses. It includes the jobs being worked on by designers, the video being edited for commercials and movies, accounting data, legal data (for law firms) and current items within an organization’s groupware system.

For the small business, vital and sensitive data are often one and the same. Vital data is data that is used in normal business practices but can be unavailable for minutes or longer. Sensitive data is often accounting data that a company can live without for a short period, but that will need to be restored quickly in the event of a loss. Small businesses will typically keep vital and sensitive data on the same type of media but may have different backup policies for each. For example, a company may choose to encrypt sensitive data and not vital data.

Non-critical data includes items such as digital records and the personal data files of network users. Non-critical data could also include a duplicate of mission-critical data from online storage. Non-critical data often resides on near-line or offline media (as is the case with email archives). It primarily refers to data kept as part of a company’s risk management strategy or for regulatory compliance, such as old emails and financial records.

Classification Methods

The chronological method of classifying data is often one of the easiest and most logical. For example, a design firm may keep its mission-critical current jobs on an Xserve RAID, vital jobs less than three months old on a FireWire drive attached to a server, and non-critical jobs older than three months on backup tapes or offline FireWire drives. It would not be possible to implement this classification without having the data organized into jobs first. Another way to look at this method is that data over 180 days old automatically gets archived, as sketched below.
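As a very simple sketch of the chronological method, a scheduled shell command could sweep anything untouched for 180 days out of the production share and into an archive volume (the paths here are hypothetical):

find /Volumes/Jobs -mindepth 1 -maxdepth 1 -mtime +180 -exec mv {} /Volumes/Archive/ \;

In practice the move would usually be handled by backup/archival software, but the logic is the same: age is the variable that drives the classification.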

The characteristic method of data organization means that data with certain characteristics can be archived. This can be applied to accounting and legal firms: whether a client is active or not is simply a characteristic, and whether a type of clothing is in style or not is another possible characteristic. Provided that data is arranged or labeled by characteristic, it is possible to archive using a certain characteristic as a variable or as metadata. Many small and medium sized companies are not using metadata for files yet, so a good substitute can be using the file name to denote attributes of the file’s data.

The hierarchical method of data organization means that files or folders within certain areas of the file system can be archived. For example, if a company decides to close down their Music Supervision department then the data stored in the Music Supervision share point on the server could be archived.

Service Level Agreements

The final piece of the ILM puzzle is building a Service Level Agreement for data management within a company. This is where the people that use each type of data within an organization sit down with IT and define how readily available that data needs to be and how often that data needs to be backed up.

In a small business it is often the owner of the company who makes this decision. In many ways, this makes coming to terms with a Service Level Agreement easier than in a larger organization. The owner of a small business is more likely to have a picture of what the data can cost the company. When given the cost difference between online and near-line storage, small business owners are more likely to make concessions than managers of larger organizations, who do not have as much of an ownership mentality toward the company.

Building a good Service Level Agreement means answering questions about the data, asked per classification. Some of the most important questions are:

How much data is there?
How readily available does the data need to be?
How much does this data cost the company, including backups? Given the type of storage used to house it, how much is it costing the company?
If nearly half the data can be moved to near-line storage, what will the savings be to the company?
In the event of a loss, how far back in time is the company willing to go for retrieval?
Is the data required to be in an inalterable format for regulatory purposes?
How fast must data be restored in the event of a loss? How fast must data be restored in the event of a catastrophe?
Will client systems be backed up? If so, what on each client system will be backed up?

Information Lifecycle Management

Most companies will use a combination of methods to determine their data classification. Each classification should be mapped to a type of storage by building an SLA. Once this is done, software programs such as BRU or Retrospect can be configured for automated archival and backups. The backup/archival software chosen will be the component that implements the SLA, so it should fulfill the requirements of the ILM policies put into place.

The schedules for archival and backups should be set in accordance with the business’s needs. Some companies may choose to keep the same data in online storage longer than other companies in the same business because they have invested more in online storage or because they reference the data often for other projects. The business logic of the organization will drive the schedule, using the SLA as a roadmap.

Setting schedules means having documentation for what lives where and for how long. Information Lifecycle Management means bringing the actual data locations in line with where the data needs to be. Once this has been done, the cost to house and back up data becomes more quantifiable and cost efficient. The SLA is meant to be a guideline and should be revisited at roadblocks and at intervals along the way. Checks and balances should be put into place to ensure that the actual data management situation accurately reflects the SLA.

ILM and regulatory compliance are more about people and business process than about required technology changes. The lifecycle of data is important to understand. As storage requirements spiral out of control, administrators of small and medium sized organizations can look to the methods of Enterprise networking for handling storage requirements with scalability and flexibility.

Mac Tiger Server Little Black Book Review

Monday, June 19th, 2006

Title: Mac Tiger Server Little Black Book
Author: Charles Edge
Publisher: Paraglyph Press, distributed by O’Reilly
Published: 2006
Price: $34.99
URL: http://www.oreilly.com/catalog/1933097140/

Roger Smith, SVMUG, June 18, 2006.

Audience: Users and system administrators trying to get the most out of Mac networking with Tiger Server.

Content: The book is divided into 18 chapters, each focused on some aspect of server functionality.

My opinion: Very much task-oriented, this book would get a lot of use next to the console of a Tiger server. It is sitting next to my server and will stay there.

There is an embarrassment of riches these days when it comes to OS X Server books. Until 10.2 there was nothing except some material on the Apple Web site.

Then Schoun Regan came out with Mac OS X Server Essentials, the first good book on Mac servers (Peachpit Press, Apple Training Series). But with each new edition, Schoun’s book is more oriented towards the budding Apple consultant who wants to understand the various components of OS X Server and then pass his or her Apple certification exam. Several sections of Mac OS X Server Essentials are titled “Understanding this” and “Understanding that”. It is a thorough book, but not suitable as a reference. It is also physically very heavy.

In contrast, “Mac Tiger Server Little Black Book” is intended as a handy reference for whatever task is at hand. Most chapters have an introductory “In brief” section that is two or three pages long. It is assumed that you understand, for example, the basics of networking. The rest of each chapter is “Immediate Solutions”, checklists and screen shots of how to accomplish the task at hand. Even the planning and installation chapter has “Immediate Solutions” like Choosing your Network infrastructure, Creating a Maintenance Plan, etc. Each chapter ends with a page or two of “Tips from the Trenches”, real world experience of these previous solutions in practice. The author has been there and done that, in the real world. “Troubleshooting …” is also a frequent topic heading.

The major chapters are: Planning, Directory Services, Windows Services (I did mention it is real-world based, right?), Sharing Files, Network Services, Printing, Web, Mail and Streaming Servers, etc. Subjects also get into more advanced areas like VPNs, WebObjects, MySQL, Java Server Pages and Collaboration.

The Little Black Book isn’t tiny at 377 pages, but it is a convenient 6 by 9 inch format and is printed on lightweight paper. It has index tabs on the margin so you can quickly locate the section, and then the 2 or 3 page solution to your problem. The book was actually designed to be used!

— Roger Smith
Complete System & Network Administration: Windows, Mac, Sun, Cisco
Apple Authorized Business Agent
Microsoft Registered Partner
408-736-7200

318 Speaks at DefCon 2004: Charles Edge is Featured Speaker

Saturday, June 26th, 2004

This year’s DefCon seminar will cover the features and fundamental concepts of Mac OS X Server 10.3.4. We will begin by describing the various roles of Mac OS X Server 10.3 in both small and medium sized offices. We will cover managing the web server, email server and file storage. Finally, we will cover upgrading from 10.2 and data backup strategies.

Bio for Charles Edge: Charles is a Senior Systems Engineer for Three18, Inc. and is a leader within the technical department, a mentor to the other field technicians, and a trusted advisor to hundreds of Three18’s client companies here in Los Angeles. His 10+ years of experience, coupled with his in-depth knowledge of IP routing, Mac OS, Windows and Linux, have made him a valuable asset to both Three18 and its prestigious roster of clients.

Charles maintains certifications with Apple, Microsoft, Cisco and CompTIA and is currently writing a Mac OS X Server book for O’Reilly publishing, which should be on the shelves in early September 2004.