For a group like mine that writes modeling software and other computationally intensive code from scratch, moving from early-2000s single-CPU approaches to SMP (or even clustering) is a painful but probably necessary undertaking. We've discussed it, but the business cases (or the requests backed by $$) never materialized, so we didn't fix what wasn't broken.
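To make the SMP option concrete, here is a minimal sketch (not our production code) of the kind of change involved: splitting an independent per-element model update across hardware threads with C++11's std::thread. The update_cell() kernel and the data layout are hypothetical stand-ins, assuming the hot loop has no cross-element dependencies.

```cpp
// Sketch: parallelize an embarrassingly parallel model update across cores.
// Assumes each element's update is independent; update_cell() is hypothetical.
#include <algorithm>
#include <cmath>
#include <thread>
#include <vector>

// Hypothetical per-element kernel standing in for the real model update.
double update_cell(double x) {
    return std::sqrt(x) * 1.0001;
}

// Divide the index range into chunks, one per hardware thread, update in place.
void update_model(std::vector<double>& cells) {
    unsigned n_threads = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (cells.size() + n_threads - 1) / n_threads;
    std::vector<std::thread> workers;

    for (unsigned t = 0; t < n_threads; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = std::min(cells.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&cells, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                cells[i] = update_cell(cells[i]);
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<double> cells(1000000, 2.0);  // toy-sized model state
    update_model(cells);
    return 0;
}
```

Even this simple pattern shows why the migration is painful: the serial code has to be audited for hidden shared state before the loop can be split, and that audit is most of the work.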
But I wonder if we should skip all of that and instead move the model to take advantage of the flexibility of cloud computing. Enterprise providers are convinced the cloud will scratch many itches. Announcements like these ([1] [2] [3]), covering Sun's and IBM's investments in cloud and Cisco's leap of faith into building server fabrics because "[cloud computing] is the future of the data center. It will evolve into clouds and change business models forever," make the question compelling.
For those of us in the niche world, we have to ask: does the investment make sense? Where does the cloud stop being useful? When is it best used? When should you not use it?