
Saturday, March 7, 2009

Record Your Products: Reap The Rewards of Recording And Getting Your Product Done Faster And Easier.

 by: Patsy Bellah

Some of you will remember when we had to type on typewriters. Some of you, present company included, may even remember when we had to type on “standard” or manual typewriters. For those who aren’t in the know, that’s a typewriter without electricity.

Then we got electric typewriters. That was something new to learn, but all our work could be done faster, easier and with less mess.

Then came computers. There was more to learn but with this technology life was made even easier for secretaries, writers, or anyone having to convey information with the written word.

With each of these advances there were those who said they couldn’t do it. They didn’t like it, they didn’t like change. They could get along just fine, thank you very much, with a manual typewriter, or an electric one. They didn’t need computers. There was too much to learn. It was too different.

Don’t let that attitude keep you from learning the latest time saver for transferring words to paper: the digital recorder. Just as the manual typewriter gave way to more sophisticated electric typewriters, which in turn gave way to the computer, the digital recorder has made it faster and easier to turn the spoken word into the written word.

On average, a one-hour recording will yield about 20-30 typewritten pages. That means that with a one-hour “conversation,” speaking your story or information into a recording device and then getting it transcribed, you can transfer your spoken word to a document in about 25% of the time it would take you to type it yourself.

It may take a bit of practice to learn to dictate into a recorder, but once you have, you will find that you can save yourself a ton of time. Statistics prove that the longer it takes to complete a project, the less likely it is that you will finish it. Embrace this new technology.

Here are some guidelines you should consider when purchasing a digital recorder:

1. You must be able to download your recording to your computer. Some of the less expensive recorders are not “downloadable.” You need to be able to transfer your recording to your computer so you can send it over the Internet to a transcription service, or transcribe it yourself.

2. Although most recorders come with internal microphones, it is best to have the capability to attach an external microphone. External microphones work better to record presentations or to record from a distance. Additionally, you can elect to use a lavaliere microphone for yourself and not be hampered with holding the recorder. Or, if you are recording more than one person, such as if you are interviewing someone, you can get an attachment which allows you to hook up two microphones.

3. The recorder should have at least four hours of available recording time using the high quality recording setting. You want to make sure the recorder has enough time to record a full presentation before having to be downloaded to the computer.

The capabilities of recorders change all the time, and in my recent research I found that the prices, like anything else, are coming down drastically and we are getting more and more recording time.

I checked out the Olympus recorder on the Internet and found a very good quality recorder for around $100.00. I also found that you could buy this at Best Buy in the Los Angeles area at the same price. Other locations such as Samy’s Cameras for those in the Los Angeles area, Circuit City, Radio Shack and Frys may also have them.

For those of you who live in the Los Angeles area, I found an Olympus and a Marantz at Samy’s Cameras, both of which use a flash card and can give you as much as 4 GB to 8 GB of storage space. Both of these sell for just under $400.00. The Sony or the Edirol are also good recorders, and have similar capabilities and prices.

Buying a recorder is much like buying a blender or a computer. Although it’s wise to buy as much as your pocketbook allows, at the same time, you don’t need to buy more than you will use. Why spend the extra money?

A digital recorder is small and easy to use. On it you can record all of your information products, plus your presentations, blogs or articles.

Embrace this new technology. Using a digital recorder to record your information product, presentations or teleseminars will allow you to finish your product in about 25% of the time it would take you to type it yourself. If you get your audio transcribed, once you get it back, all you have to do is edit it, and you can have your product completed in less than a week.


Internet And Business Online – The Act Of Interdependence

 by: Scott Lindsay

The best role of business online is that of interdependency. We’ve all heard the old saying, “No man is an island.” When it comes to online business this is especially true.

If a business owner who takes their business into the online world determines they will be self-reliant and never accept the help of anyone, then that individual will not be in business long enough to change their mind.

It is an accepted fact that the greatest tool for long-term exposure of your website is Search Engine Optimization (SEO). Without it, potential customers can’t find you. It is unreasonable to expect that you can adequately develop a website without optimizing it for the best possible search engine ranking.

Search engines also place a high value on sites that have links placed on existing sites. These ‘backlinks’ demonstrate to search engines that others trust your site. By placing your link on their website these other businesses indicate a trust and recommendation for your site.

In effect the two strategies listed above rely exclusively on what others can do for you when it comes to your online business.

Shirley Temple once proclaimed in her movie Rebecca of Sunnybrook Farm, “I’m very self-reliant.” American westerns are filled with lines dealing with pulling yourself up by your bootstraps and holding down the fort. Many of us have grown up to believe if we want something done right we have to do it ourselves.

This thinking is in opposition to the rules associated with an online business.

The online world can only exist because people share. Individuals share technology, but they also share links, reviews, blogs, forums and a wide range of other marketing strategies that create a commingling of interdependency.

In online business you are as dependent on others as they may be on you. Unlike the word ‘dependent’, the term interdependent indicates a mutual dependency. In other words you are depending on others to help provide links back to your site while they are equally dependent on you (or others) for the success of their business.

Have you really taken a proactive approach to networking? It’s possible you are reading this today and you’ve never considered asking someone else to place a link to your site on his or her online business site.

It can feel awkward to depend on others for online success, especially if you’ve been led to believe that relying on others is an imposition on their otherwise brilliant generosity.

I suppose it could be a deep-seated sense of pride that makes it hard to consider the need to ask others for help. However, the truth is that depending on others is really what has made the Internet possible. This online world has grown as a web of computers, networks and servers connected in a way that provides the maximum benefit for all.

Building an online business can feel a bit like trying to build a house of cards. Without the ability to rely on the other ‘cards’ around you it is virtually impossible to build.

Interdependence. This is the essence of online business.



Web Development And The Big Time Out

by: Scott Lindsay

One of the great debilitators in online business is simply the perceived (or real) lack of time. Business owners are used to moving forward. An online web presence can make them feel tied to an office chair learning skills they aren’t sure they want to know.

It’s not uncommon for those who deal in full-time web design to have individuals contact them for a site design but have absolutely no idea what they want. Furthermore, when the designer questions them, the response might be, “I don’t know, just make it look nice.”

And let’s not forget questions about the core values or mission of the business. Many business owners have no idea how to answer those kinds of questions. They may stare blankly for a moment or two, decide there’s no more time for further deep thought, and go back to action, without answers.

In many cases it is possible to answer some of the questions needed, but it may require taking time away from a familiar setting. It may also require more time than you think you want to give.

If you can get to a place of concentrated contemplation you are likely to find yourself stripping ideas to their core to find out what your business is trying to accomplish and what your ultimate goals might be.

As with almost any project you can turn frustration around if you will just take the time to come to terms with your vision.

Sometimes we spend so much time ‘doing’ we never stop to ask the question, “Why?”

This process can be a bit like taking a bus that drives around the park. You keep looking at the flowers and the park bench and long to sit in the quiet shade of a tree and just absorb the calming atmosphere. You know they will have a positive effect on you, but for some reason you just can’t seem to find the energy to get off the bus.

It seems to me there are some sites, misguided or rarely guided, that could benefit from this process of self-evaluation. These sites may look nice, but there is a sense of disconnection that, while not easy to identify, is fairly obvious to visitors.

Creative energy is at a minimum while business owners simply tackle what seem to be the most urgent details.

As more people gravitate to online business there needs to be a shift in the thinking of how one goes about doing business online. In many ways it can’t be approached in the same way a traditional business is developed, yet that is typically the way many new web commerce ventures choose to tackle the subject.

You may discover your business will be more successful if you take some time for rigorous reflection. The time set aside can be a bit like an architect taking the time to develop plans for a new building. You wouldn’t expect the architect to simply tell a construction crew, “Go out there and build – something.”

Work at ‘building’ your online business in a comprehensive way. Your effort can develop a firm foundation for long-term success.


Back to Back User Agents for Telecommunications

by: Danny Loeb

Today’s telecommunications networks are a delicate blend of clients and servers that together offer virtually endless possibilities when it comes to services and applications. For every new client developed, there seems to be a score more on the way — from mobile handsets, PDAs, terminals, telephones, video phones, IP set-top-boxes, and so on.

There are essentially two types of servers that connect between clients on large networks: Proxy servers and Back-to-Back User Agent (B2BUA) servers. The more prevalent Proxy servers feature predictable behavior — simply connecting clients. B2BUA servers, by contrast, are far more powerful and intelligent entities that perform actions which Proxy servers cannot. Moreover, B2BUA servers provide a flexible solution for a wide range of applications and services and are becoming the primary engine for more and more SIP servers in NGN and IMS networks.

The difference between Proxy servers and B2BUA servers is sometimes not fully understood. In this article, we will explore what makes B2BUA servers such an appealing alternative to standard Proxy servers. A better understanding of B2BUA servers can help managers weigh the value and the tradeoffs of choosing a B2BUA server, as well as the frameworks needed to develop a wide range of SIP applications and SIP services with it.

Figure 1 - Architectural difference between Proxy servers and B2BUA servers

B2BUA Server Defined
B2BUA servers are used to provide value-added features for point-to-point calls and to manage multi-point calls. The power of a B2BUA server derives mostly from the fact that it has a very generic definition, which gives it almost unlimited flexibility. However, this same characteristic is the root of the controversy surrounding it.

The IETF standard (RFC 3261) defines a back-to-back user agent as “a logical entity that receives a request and processes it as a user agent server (UAS). In order to determine how the request should be answered, it acts as a user agent client (UAC) and generates requests. Unlike a proxy server, it maintains dialog state and must participate in all requests sent on the dialogs it has established.”
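To make the UAS/UAC pairing concrete, below is a minimal, hypothetical sketch (plain Python, no real SIP stack) of how a B2BUA might map an incoming dialog on one leg to a new outgoing dialog on the other leg while keeping state for both. The message fields, class names and printed messages are simplified illustrations invented for the example, not RFC 3261 syntax.

```python
import uuid
from dataclasses import dataclass
from typing import Optional

@dataclass
class DialogLeg:
    """State kept for one leg (one SIP dialog) of a B2BUA call."""
    call_id: str
    local_tag: str
    remote_tag: Optional[str] = None
    cseq: int = 1

@dataclass
class B2BUASession:
    """A B2BUA pairs a UAS leg (toward the caller) with a UAC leg (toward the callee)."""
    uas_leg: DialogLeg  # dialog on which the original INVITE arrived
    uac_leg: DialogLeg  # new dialog the B2BUA originates toward the callee

class B2BUA:
    def __init__(self):
        self.sessions = {}  # keyed by UAS-leg Call-ID

    def on_invite(self, caller_call_id: str, caller_tag: str, callee_uri: str) -> B2BUASession:
        # Act as a UAS toward the caller: remember its dialog identifiers.
        uas_leg = DialogLeg(call_id=caller_call_id,
                            local_tag=uuid.uuid4().hex[:8],
                            remote_tag=caller_tag)
        # Act as a UAC toward the callee: originate a *new* dialog with its own Call-ID.
        uac_leg = DialogLeg(call_id=uuid.uuid4().hex, local_tag=uuid.uuid4().hex[:8])
        session = B2BUASession(uas_leg=uas_leg, uac_leg=uac_leg)
        self.sessions[caller_call_id] = session
        print(f"UAC leg -> INVITE {callee_uri} (Call-ID {uac_leg.call_id})")
        return session

    def on_bye_from_caller(self, caller_call_id: str) -> None:
        # Because the B2BUA is call-stateful, it must tear down the other leg itself,
        # something a simple Proxy server never does.
        session = self.sessions.pop(caller_call_id)
        session.uac_leg.cseq += 1
        print(f"UAC leg -> BYE (Call-ID {session.uac_leg.call_id}, CSeq {session.uac_leg.cseq})")

# Usage: the B2BUA keeps independent dialog state for both legs it participates in.
b2bua = B2BUA()
b2bua.on_invite("caller-call-id-1", "tag-abc", "sip:bob@example.com")
b2bua.on_bye_from_caller("caller-call-id-1")
```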

B2BUA servers have capabilities that far exceed those of other types of SIP servers, and answer the need for developing sophisticated value added SIP applications that cannot be implemented as Proxy applications.

Some of these capabilities, which are unique to B2BUA servers, are outlined below:

3rd Party Call Control (3PCC) Features
3rd Party Call Control (3PCC) is the ability of an entity (usually a controller) to set up and manage communication between two or more parties. 3PCC is often used for operator services and conferencing.

3PCC actions are important capabilities exclusive to B2BUA servers, since “passive,” non-call-stateful elements such as Proxy servers cannot initiate these types of activities. Some examples of 3PCC services are online billing, QoS, resource prioritization, call transfer, click-to-dial, mid-call announcement and more.

3PCC actions can be initiated automatically by B2BUA server applications, such as disconnecting a call following credit expiration in an online-billing system, or they can be initiated by remote administrative control (OSS), e.g., inviting parties to a multi-point conferencing session.
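To illustrate how a 3PCC controller drives a call between two parties, here is a hypothetical, heavily simplified click-to-dial sketch, loosely in the spirit of RFC 3725. The Party class, method names and printed messages are invented for the example, and the SDP offer/answer details are reduced to single strings.

```python
class Party:
    """Stand-in for a SIP endpoint; its attributes simulate the answers it gives the controller."""
    def __init__(self, uri: str, sdp: str):
        self.uri = uri
        self.sdp = sdp  # this party's media description (offer or answer)

def click_to_dial(controller: str, a: Party, b: Party) -> None:
    """Sketch of a 3PCC controller bridging parties A and B."""
    # 1. Controller sends an INVITE without SDP to A; A answers with its media offer.
    print(f"{controller} -> {a.uri}: INVITE (no SDP)")
    offer_from_a = a.sdp
    print(f"{a.uri} -> {controller}: 200 OK (offer: {offer_from_a})")

    # 2. Controller forwards A's offer in a new INVITE to B; B answers with its own SDP.
    print(f"{controller} -> {b.uri}: INVITE (offer: {offer_from_a})")
    answer_from_b = b.sdp
    print(f"{b.uri} -> {controller}: 200 OK (answer: {answer_from_b})")

    # 3. Controller completes both dialogs. A and B now exchange media directly, while the
    #    controller stays in the signaling path for later 3PCC actions (transfer, mid-call
    #    announcement, disconnect on credit expiration, and so on).
    print(f"{controller} -> {a.uri}: ACK (answer: {answer_from_b})")
    print(f"{controller} -> {b.uri}: ACK")

click_to_dial("controller.example.com",
              Party("sip:alice@example.com", "audio 49170 RTP/AVP 0"),
              Party("sip:bob@example.com", "audio 3456 RTP/AVP 0"))
```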

Figure 2 - Schematic outline of B2BUA server offering 3PCC functionality

Inter-working Function (IWF) for Interoperability

SIP was designed as a highly flexible and extensible protocol. The very strength of this flexibility is also an inherent weakness, since the vast array of client types in the market still needs to interconnect.

The B2BUA Inter-working Function (IWF) defines a wide range of powerful SIP servers that connect SIP clients which “speak” different protocol dialects or support different capabilities. This inter-working function is very important in enabling connectivity between clients with different capabilities and/or protocol dialects, or even between clients and networks, where the B2BUA server actually acts as an access device.

Examples of what an IWF can do include the following (a brief header-adaptation sketch follows the list):

• Connecting SIP clients to IMS networks by adding and removing IMS SIP protocol extensions (AKA P-Headers) that are essential for connecting to the IMS network
• Connecting clients with different Session Timers settings
• Connecting clients with different media capabilities and with distinct Session Description Protocol (SDP) messages by relaying between the two types of control sessions
• Connecting to different types of networks (e.g. IPv4, IPv6) and support for different transport types, such as TCP/UDP/SCTP/TLS
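As a concrete, hypothetical illustration of the first bullet above, the sketch below shows a B2BUA-style adaptation step that adds or strips IMS P-headers (for example, P-Asserted-Identity and P-Charging-Vector) when relaying a request between an IMS leg and a plain SIP client leg. The dictionary request representation and the adapt_for_leg function are invented for the example and are not an actual SIP stack API.

```python
# Hypothetical sketch of an IWF header-adaptation step: the B2BUA receives a request on
# one leg, then rewrites it before sending it out on the other leg.
from copy import deepcopy

IMS_P_HEADERS = ("P-Asserted-Identity", "P-Access-Network-Info", "P-Charging-Vector")

def adapt_for_leg(request: dict, outgoing_leg_is_ims: bool, asserted_identity: str) -> dict:
    """Return a copy of the request with IMS-specific extensions added or stripped."""
    out = deepcopy(request)
    headers = out["headers"]
    if outgoing_leg_is_ims:
        # Toward the IMS core: add the P-header the network expects to see.
        headers.setdefault("P-Asserted-Identity", asserted_identity)
    else:
        # Toward a plain SIP client: strip extensions it does not understand.
        for name in IMS_P_HEADERS:
            headers.pop(name, None)
    return out

# Usage: the same INVITE is presented differently on each leg of the B2BUA.
invite = {"method": "INVITE",
          "headers": {"From": "sip:alice@example.com",
                      "P-Charging-Vector": "icid-value=abc123"}}
print(adapt_for_leg(invite, outgoing_leg_is_ims=True,
                    asserted_identity="<sip:alice@ims.example.net>"))
print(adapt_for_leg(invite, outgoing_leg_is_ims=False, asserted_identity=""))
```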

Figure 3 - Schematic outline of a B2BUA Inter-Working Function

Multi-point Call Management

B2BUA servers can also implement multi-point call scenarios, in which multiple CPE devices connect to the B2BUA and the B2BUA provides services to all of them.

Due to these unique capabilities, B2BUA servers are widely used in the communications industry. A few examples are listed below:

• Online-billing/prepaid functions
• Servers supporting Resource Prioritization (RP) and/or Quality of Service (QoS) features
• Multi Point Conferencing servers
• IVR servers
• PBX Applications and Softswitches
• Application Layer Gateways (ALG)
• FW/NAT Traversal applications
• Privacy servers
• 3rd-Party Call Control Applications (3PCC)
• Service Creation Environment (SCE) runtime engines
• Session Border Controller (SBC)
• IMS S-CSCF, P-CSCF, I-CSCF
• SIP Inter-working Function (IWF) Gateway
• Security Gateway (SEG)
• Voice Call Continuity (VCC) servers

In addition, B2BUA servers play an important role in emerging IMS networks. Recent releases of the 3GPP IMS specifications (3GPP TS 24.229 V8.0.0) indicate that an increasing number of IMS network element servers, such as the P-CSCF, IBCF, SBC, etc., are B2BUA servers. The reason for this is that value-added services are usually session-stateful and feature capabilities that go beyond basic call proxying. Applications written on top of B2BUA Application servers fulfill several roles, such as SIP User Agents, SIP Proxy servers and SIP Registrars.

B2BUA Server Challenges

B2BUA application developers face many challenges, such as achieving rapid time-to-market, conformance and interoperability, offering customization for proprietary services and support for High Availability (HA) and redundancy. A comprehensive B2BUA framework can help developers overcome these challenges.

A solid B2BUA framework should have a modular application-building-block architecture for increased flexibility, abstraction and short delivery time. Traditional architecture, which features a single configurable state machine, is not flexible enough. A B2BUA framework should instead facilitate developing B2BUA applications by flexibly linking “pluggable,” high-level Modular Application Building Blocks (MABBs). Developers should be able to combine these MABBs, and the blocks should be designed in a way that allows developers to further customize their behavior if needed. This type of architecture complies with contemporary Service Oriented Architecture (SOA) concepts, and is suitable for powering flexible business communication platforms. This modular architecture can save months of work: with a set of MABBs in hand, developing an application is a matter of combining existing MABBs to produce the required business logic. In addition, this architecture enhances efficiency, since development of new MABBs can be done concurrently.
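The following is a hypothetical, framework-agnostic illustration of the building-block idea: small handler functions, each covering one concern, are composed into different application pipelines. The block names and the build_application helper are invented for the example and are not any vendor's API.

```python
# Hypothetical illustration of modular application building blocks (MABBs):
# each block handles one concern and can be recombined per application.
from typing import Callable

Request = dict
Handler = Callable[[Request], Request]

def billing_block(request: Request) -> Request:
    request.setdefault("billing", "prepaid-account-checked")
    return request

def qos_block(request: Request) -> Request:
    request.setdefault("qos", "priority=normal")
    return request

def announcement_block(request: Request) -> Request:
    request.setdefault("announcement", "play-welcome-prompt")
    return request

def build_application(blocks: list) -> Handler:
    """Compose independent building blocks into one application pipeline."""
    def application(request: Request) -> Request:
        for block in blocks:
            request = block(request)
        return request
    return application

# Two different applications reuse the same blocks in different combinations.
prepaid_conference = build_application([billing_block, qos_block, announcement_block])
qos_only = build_application([qos_block])

print(prepaid_conference({"method": "INVITE"}))
print(qos_only({"method": "INVITE"}))
```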

A B2BUA framework should facilitate developing applications that fully conform to standards and are interoperable, without restricting developers from customizing protocol behavior for special cases. Moreover, it should be able to accommodate non-standard implementations, as well as mediate between two versions of the same standard. This type of framework allows developers to focus on their proprietary application with the confidence that the final application will be fully interoperable.

And finally, a B2BUA framework should provide the ability to configure, amend and replace application building blocks to create proprietary features. With this ability, developers can maximize existing code – significantly reducing development time, shortening testing cycles, and reducing overall time-to-market.

Figure 4 - Traditional architecture of a B2BUA framework

RADVISION’s B2BUA Application Framework http://www.radvision.com/Products/Developer/SIPServer delivers these capabilities and more. The B2BUA Application Framework module is a part of the RADVISION SIP server Platform, a software framework that offers the essential building blocks for the development of a wide variety of high performance SIP and IMS servers. The rich set of components and modules can be flexibly combined to match customers’ requirements for developing SIP servers that offer both standard and advanced SIP services.

Applications written on top of RADVISION’s B2BUA framework are developed by combining customizable modular application building blocks. These are effectively large chunks of functionality that can be strung together to form ad hoc applications, enabling developers to focus on the high-level business logic while the building blocks hide the low-level details.

One of the most popular IM applications, Yahoo! Messenger, was the first large consumer player to adopt a B2BUA. Yahoo! Messenger combined its scalable backend platform with RADVISION’s B2BUA to serve millions of monthly unique messaging users around the world. Yahoo! selected RADVISION’s B2BUA for its robust performance and scalability features.

Figure 5 - The architecture of RADVISION B2BUA Application Framework


RADVISION also offers automatic High Availability (HA) and redundancy support. The B2BUA framework automatically replicates the run-time state of the different services and of the B2BUA framework core. In the event of a server outage, a redundant server takes over seamlessly and provides uninterrupted service continuity.
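As a rough, hypothetical sketch of the idea (not RADVISION's actual mechanism), the active server below pushes each session-state change to a standby instance, which can then resume the calls if the active server fails; the class and method names are invented for illustration.

```python
import json

class StandbyServer:
    """Standby instance that keeps a copy of session state and can take over on failure."""
    def __init__(self):
        self.sessions = {}

    def receive_replica(self, payload: str) -> None:
        record = json.loads(payload)
        self.sessions[record["call_id"]] = record

    def take_over(self) -> None:
        print(f"Standby active, resuming {len(self.sessions)} call(s)")

class ActiveServer:
    def __init__(self, standby: StandbyServer):
        self.standby = standby

    def update_session(self, call_id: str, state: dict) -> None:
        # Replicate every state change so the standby stays current.
        self.standby.receive_replica(json.dumps({"call_id": call_id, **state}))

active = ActiveServer(StandbyServer())
active.update_session("call-1", {"leg_a": "confirmed", "leg_b": "confirmed"})
active.standby.take_over()
```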

B2BUA framework benefits in a nutshell

• Significantly reduces time-to-market for developing proprietary B2BUA applications and services.
• Allows adding advanced services easily to retain competitive advantage and evolve to meet growing customer demands.
• Lets developers focus on the business logic while hiding low-level communication intricacies.
• Delivers off-the-shelf conformance and interoperability.
• Enables rapid development of applications that can interoperate with different vendors.
• Enables adding high-availability features easily.


The Battle of the Browsers – The History and the Future of Internet Browsers

 by: Nicholas C Smith

With Internet Explorer 8 now available, can Microsoft hope to retain market dominance over fierce open-source rivals such as Mozilla's Firefox or the feature-packed Opera web browser? Can history give us a clue to what the future of web browsers/browsing might hold? How did Netscape Navigator go from a dominant 89.36% market share of all web browsers in 1996 to barely 10% by 1999?

Let us take a journey that will begin long before even the intellectual conception of Internet Explorer, that will glance at its long defeated rivals, examine the current browsers available and will end with a prediction of what the future of browsing will offer us – and which browser(s) will still be around to offer it.

People often think that Internet Explorer has been the dominant web browser since the golden age of the internet began. Well, for a very long time now it has indeed been the most popular browser and at times been almost totally unrivalled. This was mainly a result of it being packaged free with Microsoft Windows, in what some would later call a brutal monopolisation attempt by Microsoft. The last few years, however, have heralded the arrival of new, possibly superior browsers. Mozilla's Firefox has been particularly successful at chipping away at Explorer's market dominance. So where did it all begin, and why were Microsoft ever allowed to have near-total market dominance?

Origins

The truth is they never did have total dominance, but at times they have come very close. Microsoft actually entered the Browser Battle quite late on. In fact, a man named Neil Larson is credited as being one of the originators of internet browsers: in 1977 he created a program on the TRS-80 that allowed browsing between “sites” via hypertext jumps. This was a DOS program and the basis of much to come. Slowly, other DOS-based browsers inspired by it were developed. Unfortunately they were often constrained by the limitations of the still fairly young internet itself.

In 1988, Peter Scott and Earle Fogel created a simple, fast browser called Hytelnet, which by 1990 offered users instant logon and access to the online catalogues of over five thousand libraries around the world – an exhilarating taste of what the internet, and web browsers, would soon be able to offer.

In 1989 the original World Wide Web was born. Using a NeXTcube computer, Tim Berners-Lee created a web browser that would change how people used the internet forever. He called his browser WorldWideWeb, a name that still sounds familiar to internet users today. It was a windowed browser capable of displaying simple style sheets, editing sites, and downloading and opening any file type supported by the NeXTcube.

In 1993 the first popular graphical browser was released. Its name was Mosaic and it was created by Marc Andreessen and Eric Bina. Mosaic could be run on Unix and, very importantly, on the highly popular Microsoft Windows operating system (incidentally, it could also be used on Amiga and Apple computers). It was the first browser on Windows that could display graphics/pictures on a page where there was also textual content. It is often cited as being responsible for triggering the internet boom by making the internet bearable for the masses. (It should be noted that the web browser Cello was the first browser to be used on Windows, but it made very little impact compared to Mosaic.)

The Browser Wars - Netscape Navigator versus Internet Explorer

Mosaic's decline began almost as soon as Netscape Navigator was released (1994). Netscape Navigator was a browser created by Marc Andreessen, one of the men behind Mosaic and co-founder of Netscape Communications Corporation. Netscape was unrivalled in terms of features and usability at the time. For example, one major change from previous browsers was that it allowed surfers to see parts of a website before the whole site was downloaded. This meant that people did not have to wait for minutes simply to see if the site they were loading was the actual one they were after, whilst also allowing them to read information on the site as the rest of it downloaded. By 1996 Netscape had almost 90% market dominance, as shown below.

Market Share Comparisons of Netscape Navigator and Internet Explorer from 1996 to 1998

Date              Netscape     IE
October 1998      64%          32.2%
April 1998        70%          22.7%
October 1997      59.67%       15.13%
April 1997        81.13%       12.13%
October 1996      80.45%       12.18%
April 1996        89.36%       3.76%

In these two years Netscape clearly dominated the internet browser market, but a new browser named Internet Explorer was quickly gaining ground on it.

Clearly worried about Netscape's dominance, Microsoft released their own browser (ironically based on the earlier Mosaic browser, which had been created by one of the men now running Netscape). It was not so much the worry that Netscape would have a 100% market share of internet browsers on their Windows operating system, but more the worry that browsers would soon be capable of running all types of programs on them. That would mean foregoing the need for an actual operating system, or at most only a very basic one would be needed. This in turn would mean Netscape would soon be able to dictate terms to Microsoft, and Microsoft were not going to let that happen easily. Thus in August 1995, Internet Explorer was released.

By 1999 Internet Explorer had captured an 89.03% market share, whilst Netscape was down to 10.47%. How could Internet Explorer make this much ground in just two years? Well, this was down to two things really. The first, and by far the most important, was that Microsoft bundled Internet Explorer in with every new copy of Windows, and as Windows was used by about 90% of the computer-using population it clearly gave them a huge advantage. Internet Explorer had one other ace it held over Netscape – it was much better. Netscape Navigator was stagnant and had been for some time. The only new features it ever seemed to introduce were often perceived by the public as beneficial for Netscape's parent company rather than Netscape's user base (i.e., features that would help it monopolise the market). Explorer, on the other hand, was given much attention by Microsoft. Regular updates and excellent usability, plus a hundred-million-dollar investment, would prove too much for Netscape Navigator.

2000 – 2005

These years were fairly quiet in the Battle of the Browsers. It seemed as if Internet Explorer had won the war and that nobody could even hope to compete with it. In 2002/2003 it had attained about 95% of the market share – about the time of IE 5/6. With over 1000 people working on it and millions of dollars being poured in, few people had the resources to compete. Then again, who wanted to compete? It was clearly a volatile market, and besides that, everybody was content with Internet Explorer. Or were they? Some people saw faults with IE – security issues, incompatibility issues or simply bad programming. Not only that, it was being shoved down people's throats. There was almost no competition to keep it in line or to turn to as an alternative. Something had to change. The only people with the ability and the power to compete with Microsoft took matters into their own hands.

Netscape was now supported by AOL. A few years prior, just after they had lost the Browser Wars to Microsoft, they had released Netscape's source code as open source. This meant anybody could develop their own browser using the Netscape skeleton. And people did. Epiphany, Galeon and Camino, amongst others, were born out of Netscape's ashes. However, the two most popular newcomers were called Mozilla and Firefox.

Mozilla was originally an open-source project aimed at improving the Netscape browser. Eventually it was released as Netscape Navigator 7 and then 8, and later as Mozilla 1.0.

Mozilla was almost an early version of another open-source browser, Firefox. Being open source, the public were able to contribute to it, adding the features it needed, the programming it required and the support it deserved. The problems people saw in Internet Explorer were being fixed by members of the open-source browser community via Firefox. For instance, the many security issues IE 6 had were almost entirely fixed in the very first release of Firefox. Microsoft had another fight on their hands.

2005 – Present

Firefox was the browser that grew and grew in these years, capturing a larger market share every year. Being more user-friendly than most of its rivals, along with high security levels and arguably more intelligent programming, helped its popularity. With such a large programming community behind it, updates have always been regular and add-on programs/features are often released. It prides itself on being the people's browser. It currently has a 28.38% market share.

Apple computers have had their own browser, Safari, since 2003, complete with its own problems, such as (until recently) the inability to run Java scripts. However, most Apple users seemed happy with it, and a version capable of running on Windows has been released. It has had no major competitor on Apple Macs, and as such has largely been out of the Browser Wars. It currently holds a 2.54% market share and is slowly increasing.

Internet Explorer's market share has dropped from over 90% to around 75%, and is falling. It will be interesting to see what Microsoft will attempt in order to regain such a high market share.

Opera currently holds 1.07%.

Mozilla itself only has a 0.6% market share these days.

The Future of Web Browsing

Web browsers come and go. It is the nature of technology (if such a term can be used) to supplant inferior software in very short periods of time. It is almost impossible for a single company to stay ahead of the competition for long. Microsoft have the advantage of being able to release IE with every PC running Windows. That covers over 90% of the market. They also have the advantage of unprecedented resources. They can compete how they wish for as long as they wish. So there is no counting IE out of the future of web browsing.

Safari is in a similar position, being easily the most popular Mac web browser. Its long-term survival is dependent upon Apple and the sale of their computers.

These are the only two browsers that are almost guaranteed another five years of life, at least. Firefox may seem like another candidate, but the public is fickle, and one bad release, or seriously lagging behind the new Internet Explorer 8 for long, could easily see its popularity quickly descend into virtual oblivion.

However, it seems likely community driven browsers, such as Mozilla and Firefox, will be the only types of browser capable of competing with the wealthy internet arm of Microsoft in the near future.

As for web browsing itself, will it change any time soon? Well, it already has for some online communities. For example, if you want to buy clothes you could try entering an online 'world', creating a virtual You to go from 'shop' to 'shop' with, looking at products and trying/buying what you see. Some 'worlds' allow you to recreate yourself accurately, including weight and height, and then try on apparel such as jeans to give you an idea of how you would look in that particular item.

Will 'worlds' like this destroy normal web browsers such as IE? It seems unlikely. Traditional web browsers provide such freedom and ease of access that it is hard to see any other alternative taking over. However, they are part of the new, 'thinking out of the box' wave of alternatives that some people will find attractive, and really, who knows what the future will bring.


Can Data Breaches Be Expected From Bankrupt Mortgage Lenders?

by: Tim Maliyil

The stock market is in a tumult. Actually, it has been for about a year, ever since the subprime fiasco (anyone take a look at Moody's performance over the past year?). Now that that particular issue has been beaten to death, other mortgage-related issues are cropping up. Most of the stuff covered in the media is financial in nature, but some of those mortgage-related issues do concern information security.

It's no secret that there are plenty of companies in the US that discard sensitive documents by dumping them unceremoniously: leaving them by the curb, driving them to a dumpster, heaving them over the walls of abandoned property, and other assorted mind-bogglingly insecure practices. In fact, MSNBC has an article on this issue, and names numerous bankrupt mortgage companies whose borrowers' records were found in dumpsters and recycling centers. The information on those documents includes credit card numbers and SSNs, as well as addresses, names, and other information needed to secure a mortgage.

Since the companies have filed for bankruptcy and are no more, the potential victims involved have no legal recourse, and are left to fend for themselves. In a way, it makes sense that companies that have filed for bankruptcy are behaving this way. (Not that I'm saying this is proper procedure.) For starters, if a company does wrong, one goes after the company; however, once the company has filed for bankruptcy, it is no more, so there's no one to "go after." In light of the company's status, this means that the actual person remaining behind to dispose of things, be they desks or credit applications, can opt to do whatever he feels like. He could shred the applications. He could dump them nearby. He could walk away and let the building's owner take care of them. What does he care? It's not as if he's gonna get fired.

Also, proper disposal requires either time, money, or both. A bankrupt company doesn't have money. It may have time, assuming people are going to stick around, but chances are their shredder has been seized by creditors. People are not going to stick around to shred things by hand, literally.

Aren't there any laws regulating this? Apparently, such issues are covered by FACTA, the Fair and Accurate Credit Transactions Act, and although its guidelines require businesses "to dispose of sensitive financial documents in a way that protects against 'unauthorized access to or use of the information'" [msnbc.com], it stops short of requiring the physical destruction of data. I'm not a lawyer, but perhaps there's enough leeway in the language for one to go around dropping sensitive documents in dumpsters?

Like I mentioned before, inappropriate disposal of sensitive documents has been going on forever; I'm pretty sure this has been a problem since the very first mortgage was issued. My personal belief is that most companies would act responsibly and try to properly dispose of such information. But, this may prove to be a point of concern as well because of widespread misconceptions of what it means to protect data against unauthorized access.

What happens if a company that files for bankruptcy decides to sell its company computers to pay off creditors? Most people would delete the information found on the computer, and that's that, end of story. Except it's not. When files are deleted, the actual data still resides on the hard disk; it's just that the computer's operating system no longer has a way to find the information. Indeed, this is how retail data restoration applications such as Norton are able to recover accidentally deleted files.

Some may be aware of this and decide to format the entire computer before sending it off to the new owners. The problem with this approach is the same as with deleting files: data recovery is a cinch with the right software. Some of these programs retail for $30 or less; some are outright free. So the sensitive data that's supposed to be deleted can be recovered, if not easily, at least cheaply, perhaps by people with criminal interests.
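To illustrate why deletion and quick formatting fall short, and what actually overwriting data involves, here is a minimal, hedged sketch that overwrites a file's bytes with random data before removing it. It is illustrative only: on journaling filesystems, SSDs with wear leveling, or systems with backups, older copies of the data can survive, which is one reason whole disk encryption (discussed below) is the simpler protection.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents in place with random bytes, then delete it.

    Illustrative only: journaling filesystems, SSD wear leveling, and backups can
    retain older copies of the data, so this is not a substitute for whole disk
    encryption or physical destruction of the media.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk, not just the OS cache
    os.remove(path)

# Usage (hypothetical file name):
# overwrite_and_delete("loan_application_records.csv")
```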

Am I being paranoid? I don't think so. I've been tracking fraud for years now, and I can't help but conclude that the criminal underworld has plenty of people looking to be niche operators, not to mention that there are innumerable ways of defrauding people (look up "salad oil" and "American Express" for an example). An identity theft ring looking to collect sensitive information from bankrupt mortgage dealers wouldn't surprise me, especially in an environment where such companies are dropping left and right.

The economics behind it make sense as well. A used computer will retail anywhere from $100 to $500. The information on it, if not wiped correctly, can be worth many times more, even after you factor in the purchase of data recovery software. Criminals have different ways of capitalizing on personal data, ranging from selling the information outright to engaging in schemes with better returns.

Is there a better way to protect oneself? Whole disk encryption is a way to ensure that such problems do not occur: one can just reformat the encrypted drive itself to install a new OS; the original data remains encrypted, so there's no way to extract it. Plus, the added benefit is that the data is protected in the event that a computer gets lost or stolen. However, common sense dictates that encryption is something ongoing concerns sign up for, not businesses about to go bankrupt. My guess is that sooner or later we'll find instances of data breaches originating from equipment traced back to bankrupt mortgage dealers.



Wednesday, March 4, 2009

Overcoming layoff survivor sickness

"They can't seem to snap back. The merger is over, the layoffs are behind us. We've streamlined the work flow, but productivity is actually lower than before we started. Am I missing something?"

These were the introductory words of the general manager of a recently merged division of a process engineering organization, and he was missing something: the reality of layoff survivor sickness.

Irrevocable paradigm shift

In slightly more than 10 years the paradigm that connects people to organizations has changed from people as long-term assets to be nurtured and developed over their careers to people as short-term costs to be managed and, hopefully, reduced. Harsh as it may initially seem, this new paradigm is reality in most organizations, and neither individuals nor organizations can afford to ignore it and hope it will go away.

The underlying cause of layoff survivor sickness is a pervasive sense of personal violation. In the old paradigm, organizations found ways to tie employees' sense of purpose and relevance to the company through tenure-based benefit plans, compensation systems that rewarded longevity and not contribution, status symbols that emphasized time with the organization, and social connections such as employee clubs and athletic teams. They caused employees to put all of their social and emotional eggs in the organizational basket, and then they dropped the basket.

These organizations were not evil or manipulative; tying employees in over the long haul was a central premise of the old paradigm. The problem is the old strategy does not work in the new reality. A question I often ask is, "If who you are is where you work, what is at threat if your job is at threat?" The answer is a lot more than a paycheck.

Survivors of mergers, reengineering efforts, and other events that lead to downsizing experience powerful and often disabling survivor feelings. Research shows that employees who have bought into the old paradigm and find themselves in the new reality experience feelings of guilt, anxiety, depression, fear, and anger. They are risk averse and emotionally drained, and they operate with reduced productivity. Organizations that have reductions on Friday and expect increased productivity on Monday are always surprised. Unfortunately, layoff survivor symptoms are long term and often don't go away without external intervention and help.

What individuals can do

Individuals can do three things. First, understand you are not the lone ranger. As companies merge and reengineer themselves in the necessary quest for efficiency and survival, employees often experience a sense of violation that leads to layoff survivor symptoms. They are caught in a paradigm shift where their values were formed in the old reality and they now live in a different world.

The second action involves what psychologists call emotional catharsis and grieving. Although these are somewhat intimidating and nonbusinesslike terms, you don't have to see a psychologist to experience them, and they are definitely helpful to the business.

What is involved is externalizing feelings and emotions that, through social pressure and fear, many people keep locked inside. If you are angry, fearful, and depressed, you are not doing yourself or your organization any good by keeping these feelings inside. You can have the best resume and the most up-to-date job skills, but without finding ways to talk about and externalize your feelings, you are going to be severely handicapped. It can be as simple as talking to your spouse, a good boss, a friend, or clergy. Or you may need professional help. Whatever it takes, the basic requirement is to put things on the table: talk about and deal with your survivor symptoms.

The third step involves breaking the conditioning of the old paradigm and finding purpose and satisfaction in your work, not in the organization where you work. In the final analysis, we are all temporary employees, and "who we are" should not be where we work but what we do. The best and happiest employees are those who are in organizations because they choose to be there.

What bosses can do

Many organizations forget that people in supervisory roles are also employees and not immune to survivor issues. "Physician, heal thyself" is the rule, and if you are a boss, you need to first deal with your own survivor issues. The next thing you need to do is help your employees.

The best way to help employees shed the shackles of layoff survivor sickness is to facilitate emotional release and move forward into productivity. This activity often goes against the grain because many organizations are uneasy about airing feelings and emotions. To be effective, a boss needs to take a risk. This can take the form of group meetings that stimulate frank discussions of emotions or one-on-one meetings with employees to listen to their feelings. This is the most important task leaders have in postmerger and reengineering environments.

Necessary wake-up call

Although hard to see in the depth of layoff survivor symptoms, there have been some very positive outcomes from the new paradigm. For individuals, the new reality has served as a wakeup call. Employees have taken stock of their lives and careers and recaptured their self-esteem. For organizations, the benefit is employees who focus on their work and their customers, not on pleasing the boss or playing politics.

Once employees ground their self-esteem in the work they do rather than where they do it, they become much more productive. These liberated employees cannot be managed by fear and intimidation. They need meaningful and relevant work that is valued by a customer. This results in a win-win solution for everyone.

Article by: david@noercounsulting.com