Turbotodd

Ruminations on tech, the digital media, and some golf thrown in for good measure.

Qualifying For The U.S. Open

I committed previously to providing some insights in the Turbo blog leading into and during the U.S. Open, and I’m going to try to stand by that commitment!

Whether or not you’re a golf fan makes no difference — it’s my hope you’ll learn something either way.

The first thing to know about the U.S. Open is that it holds the promise of entry for any qualified golfer. Every year, thousands of golfers — pros and amateurs alike, each with a U.S. Golf Association Handicap Index not exceeding 1.4 — are offered the opportunity to qualify for the Open.

The Handicap Index is the great and brilliant equalizer in golf, allowing golfers of all “handicaps” the opportunity to compete with one another in tournaments across the country.  The Index takes into account your level of play, then offers you a “handicap” to equalize the competitive landscape when you’re playing someone with, for instance, a much lower handicap.

By way of example, this week at my father’s home course, the Denton Country Club, we’ll be competing in an annual “Member-Guest” tournament (I’m the guest!).  My handicap index is 14.2, which will help his country club match my index to the complexity level (or slope) of their course, and that way, when we get flighted for the tournament, we’ll be playing against players of a similar ability.

In the case of the U.S. Open, the field is much more level, because all the players must have an index of 1.4 or better, which means they have to be darn near scratch golfers.
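
For the arithmetically inclined, here is a minimal sketch of that conversion in Python, using the standard USGA formula (113 is the slope rating of a course of "standard" difficulty; the slope of 125 below is hypothetical, not Denton Country Club’s actual rating):

    # Convert a portable Handicap Index into a course handicap:
    # Course Handicap = Handicap Index x (Slope Rating / 113), rounded.
    def course_handicap(handicap_index: float, slope_rating: int) -> int:
        return round(handicap_index * slope_rating / 113)

    print(course_handicap(14.2, 125))  # my index: 16 strokes on this course
    print(course_handicap(1.4, 125))   # the U.S. Open ceiling: about 2 strokes

In other words, the tougher the course (the higher its slope), the more strokes the same index buys you.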

And that 1.4 index is just to get into the qualifier. In 2012, there were 109 local qualifiers held from April 30 to May 17. Each local qualifier consisted of 18 holes, with a select number of players advancing to the 36-hole “sectional” qualifiers (the number of spots available at each sectional determined by the number of players at the local qualifier).

A very small number of golfers manage to navigate both stages of qualifying to earn a spot in the 156-player U.S. Open.  In 1964, Ken Venturi claimed the championship after competing in both the local and sectional qualifying, and Orville Moody did the same in 1969.  In no other professional tournament can rank amateurs rise to compete with the best of the best and actually walk away with the Championship trophy!

As former USGA Executive Director David B. Fay referred to it, the U.S. Open is “the most democratic championship” in golf.

In the sectional qualifying, which is the final stage before U.S. Open hopefuls get to the championship proper, the USGA offers 13 sectional sites – 11 in the U.S. and two overseas in Japan and England. Generally, about 750 golfers compete at the sectional qualifying level for about half of the 156 available spots in the U.S. Open.

Sectional qualifying is a grueling 36-hole one-day marathon, with only a handful of available spots at each site. The USGA established two “tour” sites in Columbus, Ohio, and Memphis, Tenn., for members of the PGA Tour who either have just competed at The Memorial (Columbus) or are preparing to play the FedEx St. Jude Classic (Memphis).

In 2005, the USGA established two international qualifiers: one in Japan (serving the Japan, Asian and Australasian tours) and another in England (serving the European Tour). In its first year, Michael Campbell of New Zealand not only qualified in England, but went on to claim the U.S. Open title at Pinehurst No. 2 in North Carolina!

So, that’s the story behind the story for U.S. Open qualifying.  It truly is the Everyman golfer’s championship, and is one of the reasons we regular “Joes” get so excited, as even “we” have a chance to win the Open!

In a future post, I’ll share some history behind the U.S. Open.

Written by turbotodd

June 4, 2012 at 9:08 pm

IBM Expands Collaborative Software Development Solutions to Cloud, Mobile Technologies

At its Innovate conference in Orlando earlier today, IBM announced a range of new software solutions to help clients create applications faster and with higher quality across multiple development environments, including cloud, mobile, and complex systems.

The software world’s push toward continuously evolving systems necessitates consistency and collaboration across the entire software lifecycle and supply chain. Yet software development teams often struggle to meet business expectations because they lack hard facts about the state of their projects.

There is a need for shared data and a consistent context across organizational boundaries, exposed through clear and honest metrics.

To address these challenges, IBM is introducing a new version of its integrated software Collaborative Lifecycle Management (CLM) solution with extended design management capabilities.

CLM is built on IBM’s open development platform, Jazz, and brings together IBM Rational Requirements Composer, IBM Rational Team Concert, and IBM Rational Quality Manager in one easy-to-install and easy-to-use solution. The new CLM software ensures that software design is integrated with the rest of the software application development lifecycle.

Development teams are now able to seamlessly collaborate on the design and development of software with key stakeholders from across the business.

According to preliminary findings of an IBM Institute for Business Value Global Study on software delivery, more than three-fourths of the participating organizations said they are underprepared for major technology trends that will impact their competitiveness.

These trends include the proliferation of mobile devices, the ability to leverage cloud-based resources for flexibility and savings, and the growing percentage of smart products with embedded software. While 50 percent of organizations believe successful software delivery is crucial to their competitive advantage, only 25 percent currently leverage it.

“Today’s business dilemma is how to address both the need for rapid delivery and sufficient control in the software development process,” said Dr. Kristof Kloeckner, general manager, IBM Rational. “We must balance the need for speed and agility with better governance to manage cost and quality, achieve regulatory compliance, ensure security, and have some level of financial predictability.”

Top Bank in China Transforms Core Processes

China Merchants Bank (CMB), headquartered in Shenzhen, China, has over 800 branches and more than 50,000 employees, and is cited as one of the world’s top 100 banks. CMB’s environment spans IBM System z and IBM Power platforms.

With geographically dispersed developers responsible for modernizing core banking and credit card processing applications, collaboration became essential. CMB uses IBM Rational CLM software to create a multiplatform application lifecycle management (ALM) environment that helps automate its development processes and break down skills silos for effective cross-teaming.

“IBM Rational Developer and ALM tools were brought into our credit card migration and core banking system project,” said Zhanwen Chen, manager of configuration management, China Merchants Bank. “Replacing older tools and coordinating the efforts of our 1,000+ developers improved our quality and performance.”

DevOps in the Cloud

In a typical organization, delivering a development change may take weeks or months, owing to infrastructure setup and configuration, testing and manual deployment, and a lack of collaboration between development and operations teams.

Continuous software delivery in the cloud allows customers to continuously and automatically deliver changes across the enterprise software delivery lifecycle, spanning development, application testing and operations. With a “DevOps” approach in the cloud, customers can reduce time to market and automate changes in development, test and production.
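
To make the idea concrete, here’s a minimal, hypothetical sketch of that kind of automated promotion pipeline in Python. The stage commands are placeholders of my own invention, not IBM SmartCloud APIs:

    # Every change runs the same automated build -> test -> deploy stages,
    # replacing the manual hand-offs between development and operations.
    import subprocess
    import sys

    STAGES = [
        ("build",  ["python", "-m", "compileall", "src"]),
        ("test",   ["python", "-m", "pytest", "tests"]),          # assumes pytest
        ("deploy", ["python", "deploy.py", "--env", "staging"]),  # placeholder script
    ]

    for name, cmd in STAGES:
        print(f"--- running stage: {name} ---")
        if subprocess.run(cmd).returncode != 0:
            sys.exit(f"stage '{name}' failed; change not promoted")
    print("change promoted through all stages")

The point is less the code than the discipline: if any stage fails, the change never reaches production, and the whole loop runs on every change rather than at the end of a release cycle.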

IBM is supporting cloud delivery, development and operations with new solutions, including:

  • The IBM Rational solution for Collaborative Lifecycle Management on IBM SmartCloud Enterprise provides an agile cloud infrastructure as a service (IaaS), well suited for development and test, designed for rapid access to secure, enterprise-class virtual server environments.
  • The IBM SmartCloud Application Services pilot provides a pay-as-you-go service that coordinates activities across business and system requirements, design, development, build, test and delivery.
  • IBM SmartCloud for Government Development and Test Platform as a service delivers industry-leading Rational tools in a highly scalable, elastic computing environment for agencies that want the cost savings of a shared cloud environment combined with Federal Information Security Management Act (FISMA) security.
  • The IBM SmartCloud Continuous Delivery managed beta, delivered via a hosted sandbox in the cloud, provides hands-on experience of DevOps capabilities, accelerating code-to-deploy through automation, standardization of repeatable processes, and improved coordination and visibility among development, test and operations teams.
  • IBM SmartCloud Application Performance Management software provides comprehensive monitoring and management capabilities that enable development and operations professionals to reduce costly troubleshooting, freeing resources to focus on developing new innovations and services for customers. With this tighter integration, application issues can not only be found and resolved faster, but also be proactively prevented, avoiding future service disruptions.

Enterprise Mobile Development

IBM Rational CLM has also been extended to the IBM Mobile Foundation platform for centralized code sharing and distributed mobile application development.

Currently, fragmentation across mobile devices, tools, and platforms complicates the delivery of mobile applications, which typically demand faster time-to-market and more frequent releases.

The IBM Enterprise Mobile Development solution helps teams apply an end-to-end lifecycle management process to design, develop, test and deploy mobile applications, while enabling seamless integration with enterprise back-end systems and cloud services through mobile-optimized middleware. The solution brings together several offerings that build on the recent Worklight acquisition as well as IBM’s enterprise development environments.

Green Hat Technology in New IBM Test Automation Solutions

Today’s applications and manufactured products put additional pressure on development teams to find innovative ways to attain agility and increase the rate at which software updates are delivered for testing.

IBM has integrated the recently acquired Green Hat technology with IBM Rational CLM to help address the challenges of testing highly integrated and complex systems and simplify the creation of virtual test environments.

New IBM test automation solutions use virtualized test environments and can reduce the costs associated with the setup, maintenance and teardown of infrastructure in traditional or cloud-based test implementations.
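
The general technique behind those virtualized test environments is often called service virtualization: stand in a lightweight stub for a costly back-end dependency so integration tests can run without provisioning the real system. Here’s a generic sketch of the idea in Python (the endpoint and payload are invented, and this illustrates the concept rather than the Green Hat product itself):

    # A stub that impersonates a back-end service with canned responses,
    # so tests avoid the setup and teardown of the real infrastructure.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class StubBackend(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/accounts/42":  # hypothetical endpoint
                body = json.dumps({"id": 42, "balance": 100.0}).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    # Tests point at localhost:8080 instead of the real back end.
    HTTPServer(("localhost", 8080), StubBackend).serve_forever()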

Over a Decade of IBM Software Development Leadership

For the eleventh consecutive year, IBM has been named the market share leader in the worldwide application development software market by Gartner, with 25 percent of the market.

Gartner reported that IBM continues to lead in key and growing segments, including Distributed Software Change & Configuration Management, Requirements Elicitation and Management, Design and Java Platform AD Tools, and realized 25 percent growth in the Security Testing (DAST & SAST) market.

Additionally, according to Evans Data Corporation’s Users’ Choice: 2012 Software Development Platforms (a survey of 1,200 developers globally), IBM Rational continues its reign as the most highly rated offering in the overall platform rankings, an honor it has earned six times in the last seven years.

IBM & Syracuse: Building Critical Software Development Skills

If you’ve been watching any of the Livestream coverage emerging from the IBM Innovate event down in Orlando, you know that skills are a key issue facing software development shops everywhere: the need for new and changing skills, skills for new platforms and development languages, skills to help pull it all together.

Today at Innovate, IBM announced that it is working to help address the skills issue through a new partnership with Syracuse University, intended to help college students build computing skills to manage traditional and new systems in large global enterprises.

As business value creation increasingly shifts to software, the skills needed to tackle disruptive technologies like cloud and mobile computing, particularly for enterprise-class, large industrial systems, have become critical.

Lack of employee skills in software technologies is cited as the top barrier that prevents organizations from leveraging software for a competitive advantage, according to initial findings in IBM’s Institute for Business Value 2012 Global Study on Software Delivery.

And according to IBM’s 2012 Global CEO Study, including input from more than 1,700 Chief Executive Officers from 64 countries and 18 industries, a majority (71 percent) of global CEOs regard technology as the number one factor to impact an organization’s future over the next three years — considered to be an even bigger change agent than shifting economic and market conditions.  

Syracuse GETs Skills

Syracuse University’s Global Enterprise Technology (GET) curriculum is an interdisciplinary program focused on preparing students for successful careers in large-scale, technology-driven global operating environments.

IBM and a consortium of partners provide technology platforms and multiple-systems experience for the GET students. IBM’s Rational Developer for System z (RDz) and zEnterprise systems help students build applications on multiple platforms, including z/OS, AIX, Linux and Windows.

“Our students need to build relevant skills to address the sheer growth of computing and Big Data,” said David Dischiave, assistant professor and the director of the graduate Information Management Program in the School of Information Studies (iSchool) at Syracuse University. “These courses and the IBM technology platform help prepare students to build large global data centers, allow them to work across multiple systems, and ultimately gain employment in large global enterprises.”

Close to 500 students have participated in the Global Enterprise Technology minor since its inception. Syracuse University’s iSchool is the No. 1 school for information systems study, as ranked by U.S. News and World Report, and serves as a model for other iSchools that are emerging around the globe.

Back To The Mainframe Future

More than 120 new clients worldwide have chosen the IBM mainframe platform as a backbone of their IT infrastructure since the IBM zEnterprise system was introduced in July 2010.

The zEnterprise is a workload-optimized, multi-architecture system capable of hosting many workloads integrated together, and efficiently managed as a single entity.

Syracuse University is a participant in IBM’s Academic Initiative and was a top ranked competitor in IBM’s 2011 Master the Mainframe competition.

As today’s mainframes grow in popularity and require a new generation of mainframe experts, the contest is designed to equip students with basic skills to make them more competitive in the enterprise computing industry job market.

IBM’s Academic Initiative offers a wide range of technology education benefits to meet the goals of colleges and universities. Over 6,000 universities and 30,000 faculty members worldwide have joined IBM’s Academic Initiative over the past five years.

Flame No Game

What a week last week was for cybersecurity matters.

First, the story about the Flame virus discovered by Kaspersky Lab in Russia, a new and improved “Stuxnet”-style virus that has apparently infiltrated computers throughout Iran (and, it seems, beyond).

Then, The New York Times reported on the code-named “Olympic Games” cyberintrusion program, in which the U.S. and Israel allegedly developed Stuxnet for the express purpose of disabling Iranian centrifuges that were being used to enrich uranium.

If you ever wondered when or whether the digital realm would meet the physical one, Stuxnet and, now, Flame are perfectly good examples of how that intersection is being brought about.

But Eugene Kaspersky himself, whose team discovered the Flame virus, suggests this intersection is one of foreboding, explaining at CeBIT last month that “Cyberweapons are the most dangerous innovation of this century.”

Is he right?  More dangerous than the nuclear weapons they were intended to prevent the manufacture of in Iran?

More dangerous than Hellfire missiles zooming down from the skies of Pakistan?

I suspect it depends on your respective point of view, literally.  But there can be no question the cyberintelligence debate will heat up over the coming years.

Now that digital mechanisms (often far more economically efficient than traditional means) have proven productive for warfare and espionage, state actors will likely shift more investment into cyber territory, putting much more muscle into what had previously been the domain of fringe actors.

Such a trend could lead to the development of much more serious and sobering digital “agents” whose primary purpose — for espionage, for risk mitigation, and so forth — could ultimately be betrayed by Murphy’s Law of Unintended Consequences.

The virus intended to destabilize the spinning centrifuges in Iran could spin out of control and instead open the floodgates on a dam in China.  Or so goes the fear.

But perhaps the fears are not without some justification?  If you don’t know who you can trust in the digital milieu…or, worse, if your systems don’t know who they can trust…how can you trust anyone? Or anything?

Just overnight, SecurityWeek posted that Microsoft had reached out to its customers and notified the public that it had discovered unauthorized digital certificates connected to the Flame virus that “chain[ed] up” to a Microsoft sub-certification authority that had been issued under the Microsoft Root Authority.

If such certificates can be co-opted by the “Flames” of the world, and appear to be legitimate software coming from Microsoft…well, that’s a fast and slippery slope to cyber anarchy.
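
To see why, consider what “chaining up” means mechanically: a certificate is trusted only because each link in its chain is signed by the authority above it, all the way to a trusted root. Here’s a minimal sketch of checking one such link with the Python cryptography package, assuming RSA-signed certificates (the usage at the bottom, file names included, is hypothetical):

    # Does `issuer` actually sign `child`? If an attacker controls any
    # certificate authority in the chain, everything it signs passes this
    # test -- the danger posed by a rogue sub-certification authority.
    from cryptography import x509
    from cryptography.hazmat.primitives.asymmetric import padding

    def issuer_signed(child_pem: bytes, issuer_pem: bytes) -> bool:
        child = x509.load_pem_x509_certificate(child_pem)
        issuer = x509.load_pem_x509_certificate(issuer_pem)
        try:
            issuer.public_key().verify(
                child.signature,
                child.tbs_certificate_bytes,
                padding.PKCS1v15(),              # assumes an RSA signature
                child.signature_hash_algorithm,
            )
            return True
        except Exception:
            return False

    # Hypothetical usage:
    # with open("leaf.pem", "rb") as f, open("intermediate.pem", "rb") as g:
    #     print(issuer_signed(f.read(), g.read()))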

As SecurityWeek also recently reported about Flame, yes, the short-term risk to enterprises is low.  But Flame “demonstrated that when nation-states are pulling the strings, they have the ability to repeatedly and significantly leap ahead of the state of the art in terms of malware.”

As state actors raise the table stakes by developing more and more sophisticated cyber intruders, they will, in essence, be raising everybody’s game. These viruses don’t live in a vacuum — they will be gathered by non-state actors, hackers white hat and black hat alike, then deconstructed, disassembled, and, potentially, improved upon before being reassembled and unleashed back into the wild.

So what’s the answer?  Unfortunately, there is no single cyber bullet.

Constant vigilance, education, monitoring, and adaptive learning will be required to keep pace with the rapid evolution (or, as the case will likely be, devolution) of these digital beasts, and enterprises everywhere would be well served to step up their Internet security game.

Finally, let’s not forget that state actors aren’t just looking to inflict damage — many are searching for valuable intellectual capital they can benefit from economically.

That alone is more than enough justification for enterprises to have a more comprehensive cyber intelligence strategy.

In the meantime, let’s just hope the next Flame or Stuxnet doesn’t lead to a more disastrous scenario than knocking out a few centrifuges in Natanz, one that starts to make a Michael Crichton novel look as though it’s actually coming to life!

Written by turbotodd

June 4, 2012 at 3:59 pm
