February 02, 2006

Mining the Mind of Steve Krause

Back in the days when I was complaining that 30-year-olds were running corporations, the gang of us at Hyperion eCRM had to deal with the equation hype + money - common sense = success. At least that's the way it was in the days of irrational exuberance. As one who always likes to review history in the light of renewed appreciation or scorn, I find it fascinating to discover whatever happened to... along the lines of my career path and pointed pontifications. It's for that reason that I started my little jag on Xerox History. In the meantime, as the F500 slouches towards real security, BPM and data mining, I have fun digging up data on my own industry.

Steve Krause is my latest find. He puts up a nice practical post on Last.fm and a competitor I'd never heard of or paid attention to. It fits rather congruently with my 'Do As I Say' theory. Note that the ultimate judge of appropriateness is aesthetic consistency, an entirely human creation.

It also turns out that Krause was a competitor at Personify. It's amazing that they were able to burn through half a billion dollars. If those happy days ever come back, never give a sucker an even break.

Posted by mbowen at 02:22 PM | Comments (2) | TrackBack

January 06, 2006

Dynamic vs Transactional Engineers

Every once in a while I kick something over to Drezner's site, especially when he writes on outsourcing, which is one of my pet peeves. As a highly technical, architecture-level consultant who does hands-on work, I really don't blog about it as much as I probably ought to. I've thought that I could run multiple blogs and keep all those worlds from intermixing, but I think I may begin to blur that firewall too. So on the matter of the quality of engineering and technical talent from the emerging second world, I had this to say:

I think that this distinction between dynamic and transactional engineers is very useful and accurately describes what I see in the software industry as well.

Even when Chinese and Indian programmers are on staff in American companies, there are notable differences. You never quite know what you're getting until you sit folks in a room and start talking about the systems to be built.

Enterprise systems that are to make a difference in the productivity of the target customers are notoriously difficult to assemble, even when using the simpler standard technologies that procurement departments are demanding be offshored. That is why companies like Accenture continue to make money in the American and international markets. The skills of management consultants who can do a tight handover to technologists are in high demand, but even so, the resulting applications tend not to be robust. Anything that takes more than six months to build will suffer from changes in the business environment, turnover in personnel and integration with other systems that are themselves being changed.

Again, the allure of cheap labor in this area is that SQL is SQL. Not necessarily so. It's actually getting more complicated to do these applications properly, primarily because of an attitude that 'best practices' can be built into every application. This means that a lot of abstraction of problems is done, and a number of experts who don't do hands-on work are employed. All this is done at the expense of homegrown (meaning inside the client company) experience, which is the great hidden expense of outsourced systems.

It comes down, in my view, to a decades-old clash between management philosophies: Deming vs. Hammer. The Deming method says to evolve the way people are working with technology and business processes. That's highly integrative and evolutionary. It means you have to do a lot of listening and translating. The Hammer school says throw out the old and make everyone start from scratch. That's re-engineering. Companies that get re-engineered outsource and offshore better than companies that evolve. Companies that evolve are more productive because the culture of the company teaches everyone what the focus of the business is. They can be more nimble, but there's a steep learning curve.

I'm from the Deming school (an old Xeroid from the McKinsey makeover under David Kearns) and have been in the enterprise software business for two decades. The Deming way is harder, and it's often against the best interests of management and technology consulting companies, not to mention software vendors, to evolve a good company to a great one. But the real loss is that so many American corporations are literally outsourcing their own quality improvement. They don't want to grow their own MBAs; they want somebody else's. This dependency is what both depletes American talent and keeps consultants like me in new BMWs.

Meanwhile, over here at Nissan, things are beginning to get very interesting as deadlines start to loom. While Toyota is poised this year to make 100k more cars than General Motors, I'm finding that the systems being built in the wake of Sarb-Ox may have a long-term payoff for American businesses. Because over here in this Japanese company, typical of a large number of companies I've seen, the spaghetti and spontaneity of financial planning and accounting boggles the professional mind. In my close circle, we have a term called 'The Official Ass'. That's where a lot of numbers are pulled from. The companies that do real demand planning are few and far between. Demand planning is very difficult to do with the kinds of accounting systems most companies have, and so companies pull numbers out of a collective hole in the ground. That is particularly typical of non-financial companies - that is, companies whose business is not primarily the management of money. When your financial staff is considered overhead... well.

It's one thing to understand how to build this database and that database. Sure, that stuff can be outsourced. Should it be? It really depends on whether you're building systems for the sake of building systems or building them to improve the way a company is run. The latter can only be done by dynamic engineers. I like to think of myself as one of those types, but it's not very often that we are called to do all that.

Posted by mbowen at 05:47 PM | Comments (1) | TrackBack

June 15, 2005

The Grinchbag Theory

(soon to be translated into Seussese)
The story of the grinchbag is simple. It is an IT parable based on the story of the Grinch Who Stole Christmas.

Whoville is where all the users live, and throughout the year they gradually pile up their little Who requirements and expect at the end of the year that the IT Santa Claus will fulfill all their desires.

Now the Grinch may be anybody in this tale, but the point of the grinchbag theory is that, for whatever reasons, all of the goodies and wishes are accumulated into one giant bag and taken away from Whoville. And whenever you put everything into one grinchbag, it's awfully hard to move all at once.

So if you're holding the grinchbag at the top of the hill and you change your mind for any reason, you're going to have to grow your heart three times and attain superhuman strength to save Christmas. The fact of the matter is that you won't. You can only hope that the Whos will sing Dahoo Dory without their goodies, because the only thing superhuman strength does is save you from being crushed by the grinchbag.

The moral of the story is to spread goodies throughout the year and not try to gather up everything in one grinchbag. The Whos are going to sing with or without you. You really aren't in charge of their spirits, which you can't possibly understand because you don't live in Whoville.

Posted by mbowen at 06:05 PM | TrackBack

April 28, 2005

OLAP Blog Integration

I'm also about to crank up the volume on getting some BI & OLAP bloggers coordinated. I think it's sad to see a bunch of stragglers out there with no trackbacks or comments. I'm going to play cat herder and see what I can come up with.

Part of the problem is that there isn't a good back and forth with industry experts. Rumor has it, however, that John Kopke is a blast in front of customers. I know he understands the technology, and I used to work with him at Pilot Software back in the day. And yet there are a lot of people trembling with fear at guys like Nigel Pendse. Now there's a showdown I'd like the blogosphere to witness.

I've been neglecting the dialog because I've been downmarket for a while and basically nobody's talking because smaller companies don't have time to listen. But now that I've come back to the state of the art and I see what Hyperion has been doing in terms of rolling out new products and upping the ante and expanding the scope of BI'able applications, there's plenty to say. The question is, who's saying it? I certainly would have been, but that's not what Cubegeek turned out to be. Let's give it a shot in this brave new world.

UPDATE
Introducing The Cubegeek Blog

Posted by mbowen at 03:12 PM | TrackBack

April 23, 2005

Master Data Management

One of the most exciting pieces of software to come down the pike in many years is one picked up in a recent acquisition by Hyperion Solutions. It's one of the reasons I have to be fairly jazzed about the kinds of systems I'll be able to build in the coming months. Formerly called Razza, it's now Hyperion's Master Data Manager.

If you had asked me a month ago what was the best way to make money in the Enterprise Computing business, I would have told you Master Data Management. I wouldn't have used that precise term, I would have probably said something like this:

One of the biggest problems for me in building systems with the tools I have is always the political problem of getting all the people talking the same language. A significant reason why DW initiatives fail is that the metadata is all over the place and everybody spends too much time chasing the data down rather than analyzing it. All I need are my tools (speaking of Essbase outlines) and then I get functional people and technical people speaking the same language, because everybody can see how the numbers and entities roll up. The reason Informatica is making all kinds of money in this space is because they promise to solve this problem.

Well here's what IDC says.

Master data management is a challenging, long-standing problem. But recent attention to business performance management and compliance represent a new opportunity to deal with the issue in a way that can improve both information accuracy and organizational agility.

With Hyperion's MDM, I believe the problem has been solved. As soon as I get a copy I'll get deep into the details, but basically this is a collaborative tool that will allow enterprises to manage all of their dimensions, whether they change slowly or quickly, back through history.

Imagining the worst spaghetti possible - a partial migration between ERP systems without the benefit of ETL - an MDM server would get everyone on the same page. How many times have I had people squawk about the complexity of PeopleSoft trees and complain that their reporting systems use one drilldown, their internal reporting systems use another, and that the Business Objects Universe was painstakingly coded with yet another? And how many times have I had to be the one to reverse engineer all that rot and put it into my systems? Too many to count.
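
To make that concrete, here's a minimal sketch (in Python, with invented account names and rollups) of the core question an MDM server has to answer across all those systems: which members roll up differently where?

# Hypothetical child -> parent maps from two systems that should agree.
gl_parents = {"4100": "Revenue", "4200": "Revenue", "5100": "COGS"}
bo_parents = {"4100": "Revenue", "4200": "Other Income", "5100": "COGS"}

def diff_hierarchies(a, b):
    """Return members whose parentage disagrees between the two rollups."""
    return {m: (a.get(m), b.get(m))
            for m in set(a) | set(b)
            if a.get(m) != b.get(m)}

print(diff_hierarchies(gl_parents, bo_parents))
# {'4200': ('Revenue', 'Other Income')}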

I'm going to have a field day with this tool. Believe that.

Posted by mbowen at 01:39 PM | TrackBack

April 22, 2005

Tableau Software: Visualization Comes to BI

As part and parcel of my boredom with politics, and the lack of inspiring things like juicy battles overseas with which to occupy my mind, I will indulge the geek side a bit. Something tells me I ought to do so elsewhere; in fact I probably will, although I really am divided about all that with regard to implementation, etc. I like to keep my politics and my profession in separate parts of the plate.

Anyway, the reason for the excitement at the current moment is my discovery of Tableau Software. This is the sweetest front-end I've seen since I first laid eyes on Wired for OLAP. It's a pure visualization tool with about as much wow factor as anything in BI. It could singlehandedly destroy the concept of prefab dashboards, though chances are that's not going to happen for some time.

What's best about it is that it is pretty simple to use, gives dramatic presentations which are very rich and informative, and it uses the single best realtime fat client on the planet as its backbone: Microsoft Excel. Tableau hasn't wasted a bunch of time and energy on the n-tier fantasy; instead they have made use of both corporate inertia and Moore's Law. Guess what, ladies and gentlemen: PCs can handle big fat clients, and this one delivers.

What I'm doing these days is ramping up my own data warehouse on the home network, which has about 9 boxes right now in various states of function. I've managed to keep various scraps of master data from a zillion clients over the years, and I have a fake data generator that I built from scratch. So there's about to be a world of research I'm going to be able to accomplish here at Lab 107 (a wholly owned subsidiary of Metro Decisions).
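
For the curious, the generator is nothing exotic. A stripped-down sketch of the idea looks like this (in Python here for brevity, with made-up dimensions and ranges; the real thing has many more knobs):

import csv
import random

# Invented dimension values; substitute real master data scraps as available.
REGIONS = ["East", "West", "Central"]
PRODUCTS = ["Cola", "Root Beer", "Cream Soda"]
MONTHS = ["Jan", "Feb", "Mar"]

with open("fake_sales.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Region", "Product", "Month", "Units", "Sales"])
    for region in REGIONS:
        for product in PRODUCTS:
            for month in MONTHS:
                units = random.randint(50, 500)
                price = random.uniform(1.0, 3.0)  # arbitrary unit price range
                writer.writerow([region, product, month, units, round(units * price, 2)])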

Hell, I might even snarf some interesting data from blogs and other public data sources just to show off. Stay tuned, the geekfest is about to begin.

Posted by mbowen at 11:34 AM | TrackBack

April 19, 2005

Pirate Code

I have heard some way out schemes in my life, but this one takes the cake. A pirate ship off the coast of Los Angeles with H1B failures coding enterprise systems? Yeah right.

The demand for highly qualified programming staff is high, but not that high. What people forget is that the demand for highly competent software staffs is growing while the demand for highly engineered software is diminishing. That is to say that Open Source will grow, and it doesn't matter where that comes from, but implementing software *on site* is the toughest and most demanding job in the industry. It is the equivalent of changing a tire on a moving car. The moving car is the business of the enterprise, and they're not going to paddle off to some boat in order to get their specs.

I'm betting that the market is going to get tougher, not easier, and that the necessity of having personal contact is going to be greater, not less. For all the marvelous things we do with software, our interactions are only going to get more complex. It is with software as it is with law - even though it belongs to everyone and is sort of open source, when you need yours, it's all about intimate contact with the squad who is going to take you through it.

For software engineering, I can see that such flighty ideas might have weight. The question is, what kind of character wants to work on a rusted out ship? I think the best programmers are going to want to drive nice cars and get dates...

References

  • BoingBoing

Posted by mbowen at 10:57 AM | Comments (3) | TrackBack

September 29, 2004

More Work Talk

After, or during, a long day slaving over hot algorithms and queries, the last thing I want to blog about is work. In fact, I've probably talked more about crap I have absolutely no clue about than about the things I'm world class in, which are OLAP and data warehousing.

These days I'm getting a little fed up with this blog being of no use in spreading the word and working with work. Since my mind is made up about this upcoming election, and I really don't think that people who are undecided should even bother, I'm not going to take up much more space on that tip until after the election. Plus P6 and I get to kiss and make up. So starting today I'm going to start blogging about OLAP and DW, which may be a very uncompelling subject, but...

But I have work for people who are interested in this very cool field, and you all need to be aware that I have a little bit of moving and shaking ability. My little business, Metro Decisions, is ready, willing and able to start responding to the interest in the field, and quite frankly I am really tired of telling the recruiters and agents who call me every week: "Sorry, I'm booked and I don't know anybody who is available to work now."

If you're interested, start talking. Right here. Right now. There's lucrative work sitting undone and several industries untouched by some truly cool technology that I'm trying to bring. My other site, CubeGeek.com, can use a little assistance, so I'm goosing it here. You can see all of the job referrals that I get. I will continue to post them there and perhaps refer to them here. This will make Cobb a bit more boring, but it can lead to economic happiness.

    By the way, I'm also getting into more traditional web development stuff as well, so if you're ready to talk that, let's do it.

    Posted by mbowen at 11:44 AM | Comments (5) | TrackBack

    September 03, 2004

    The Hatred Continues

    Those who hate Microsoft use Linux.
    Those who love UNIX use BSD.

    -- Anonymous

I can tell that the headaches are about to start. I can feel a new brace of wind from the ever-squabbling factions in the mind-numbingly tedious OS wars. In today's installment, the open sourcers are crowing that some new winky-blinky has been dropped from a future release of Longhorn, an MS OS version that's somewhere on the horizon, to be released in the next year or two. Not only that, there's scoffing about XP's new Service Pack and bragging about Apple's new iMac.

    As the bored chicken said, cockadoodle-whatever. Is there anything really new in computing?

    Posted by mbowen at 08:49 AM | Comments (2) | TrackBack

    July 21, 2004

    IBM Taking BI Seriously?

Within the past week IBM has acquired Alphablox and added a new MicroStrategy bundle to its offerings. Is this another attempt to forestall a Yukon invasion, or are they just filling up company checkboxes?

My guy on the inside is as confused as anyone, but his angle is that it's just another way for them to fill out their WebSphere story. Alphablox is very labor intensive, and if you know it, it's a good way to bill consulting dollars, if not build a very efficient system. I don't think the guys at Cognos or Business Objects are shaking in their boots. However, for those big huge 'we don't really know what we're doing but we sure do have a methodology' type projects, there will be a lot of Rational guys filling out forms. That could be a good thing, I suppose.

    Posted by mbowen at 10:31 AM | TrackBack

    July 13, 2004

    OWL & Metadata Management

    There's an interesting nexus between some ideas Cameron and I dreamed up when thinking about XRepublic and this new notion of OWL and worm's eye metadata.

    The point is that to harness the power of distributed thinking in assigning value to metadata on a subject matter, some categorization is necessary. You want to avoid re-inventing the wheel, but you don't want valuable one-offs to go unnoticed.

In the context of a bottom-up deliberative space whose purpose is the generation of consensus, I came up with the idea of a Taxonomy Hike. If you care about your idea, then you can walk the distance it takes to attach it to an indexing scheme that people will look for. For example, if you want to weigh in on transracial adoptions, you might want to hook that up under the Parenting category. Cameron suggests Hipboning that subject in a dynamic way such that you can gain valuable insights by taking tangents that don't work in a hierarchical taxonomy.

    The following is from the XRepublic talk about the Wonk Path.

    Taxonomy Hike
    At some point, the Wonk needs to do some research. The research that is done online in the context of discussions and artifacts of the XRepublic necessitates a Taxonomy Hike. The Hike requires the Wonk to show a demonstrated effort to understand the context of similar discussions. Since there will be a Master Taxonomy which incorporates all of the discussions, the system should deliver the most 'considered' objects relating to the subject. This assures the Citizens that wonks have done their homework. After a wonk has taken the hike, she may be presented as a wonk and begin crafting a resolution, litmus test or other partisan artifacts.

    Hipbone Room
The Wonk at any time in the process should be able to free-associate in the research space. In the Hipbone Room, some gaming is done to connect different concepts. An excellent introduction to Hipbone Games is given by Charles Cameron. As Cameron has shown me via Hipbone, one of the most attractive things about dialog is that sometimes you happen upon a connection, a vibe as it were, which is completely unintentional but gives a great deal of insight into another class of problem or issue. XRepublic is more of a funnel for weighty arguments as determined by peers and is as such directed towards a specific purpose. But it is also the aha of a tangential discovery that can lead to greater insights.

How indeed does some matter of Niger relate to a resolution of war? Nobody, but nobody, was discussing that in March of this year, when it became the top reason pro or con. Nevertheless, something tangential in a different house, perhaps talking about Northwest Africa, might be hipboned into the conversation somehow. The best we could come up with was the idea of a 'hipbone room' in which people free-associate and build weblike structures in which seemingly disparate ideas can be linked to each other for another layer of contemplation.

In the context of BI and decision support, the need is the same. If you notice an interesting pattern of consumption by eyeballing your POS data in a retail business - say you find a lot of men buying diapers and beer - you need the ability to flag that notion. There should be some way to hike the taxonomies of the global data warehouse from the data supply side, which has a quick worm's-eye facility onsite. Doing this before the data gets assimilated into the top-down paradigm of the DW might even obviate the need for data mining. At the very least it embeds human clues into the system.
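
A crude sketch of what that flagging might look like, in Python with invented baskets; the thresholds are arbitrary, and a real system would tune them:

from collections import Counter
from itertools import combinations

# Invented point-of-sale baskets for illustration.
baskets = [
    {"diapers", "beer", "chips"},
    {"diapers", "beer"},
    {"milk", "chips"},
    {"diapers", "milk"},
]

n = len(baskets)
item_counts = Counter(item for b in baskets for item in b)
pair_counts = Counter(frozenset(p) for b in baskets for p in combinations(sorted(b), 2))

for pair, count in pair_counts.items():
    a, b = tuple(pair)
    # Lift > 1 means the pair co-occurs more than independence would predict.
    lift = (count / n) / ((item_counts[a] / n) * (item_counts[b] / n))
    if lift > 1.2 and count >= 2:
        print(f"flag for taxonomy review: {a} + {b}, lift={lift:.2f}")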

I'm rather unclear about exactly what OWL is, but this page suggests it's a general-purpose facilitator for creating ontologies. I would expect that ontologies are broader than taxonomies, but make more sense from the worm's-eye view. After some time, people might agree that a Diaper/Beer ontology needs recognition and acceptance into the formal Taxonomy.

    How's that for a groupthink killer?

    Posted by mbowen at 01:46 PM | Comments (1) | TrackBack

    July 07, 2004

    Worm's Eye Distributed Client Side Aggregations

I just had a flash of inspiration. One of the problems with the cube paradigm is as Neil Raden describes in his Netezza white paper: it doesn't take advantage of cheap hardware scalability, and it abstracts the most educated guesses of strike teams that don't always strike true.

What if we reversed the paradigm of query aggregation with the understanding that end-users have significant hardware resources at their disposal? The idea here is that an analyst will poke and peek at detail and then come to some preliminary conclusions given their worm's-eye view of the business. Therefore you give them very specific access to real-time data in their area of expertise and then allow them to speculate about the effect on the business. In other words, rather than using standard top-down metadata to pre-build aggregations for marting, allow ad-hoc aggregations using bottom-up metadata decided on by some collaboration of worm's-eye views of the business.
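
A minimal sketch of the idea in Python, with invented detail rows; the point is that the grouping metadata belongs to the analyst, not to the warehouse:

# Invented detail-level facts: (sku, date, amount).
detail = [
    ("SKU-1", "2004-07-01", 120.0),
    ("SKU-2", "2004-07-01", 75.0),
    ("SKU-3", "2004-07-02", 200.0),
]

# The analyst's worm's-eye metadata: her own local idea of what rolls up where.
my_groups = {"SKU-1": "impulse buys", "SKU-2": "impulse buys", "SKU-3": "staples"}

totals = {}
for sku, day, amount in detail:
    group = my_groups.get(sku, "unclassified")
    totals[group] = totals.get(group, 0.0) + amount

print(totals)  # {'impulse buys': 195.0, 'staples': 200.0}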

    Posted by mbowen at 10:39 AM | Comments (1) | TrackBack

    June 23, 2004

    Arboretum

    'Arboretum' is the code name for a project that I've been needing to do forever. This is a programmer's tool for understanding and maintaining hierarchies of all sorts, but in particular those associated with enterprise software.

    Every time I go to a customer I have a huge headache in getting their charts of accounts straight. It's such a simple concept that you would think that somebody would have made this kind of tool before. But no.

    So I'm going to specify this and sell it into my client base. The trick is that it will support plugins that talk to all of the major packages. If I can do this, I'm going to make a fortune. All I need is time and a programmer.
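
To give a flavor of the checks such a tool would run, here's a toy sketch in Python with invented accounts; the real product would do this against each package's native hierarchy format via those plugins:

# Invented chart-of-accounts edges as (child, parent) pairs.
edges = [
    ("4100", "Revenue"),
    ("4200", "Revenue"),
    ("5100", "COGS"),
    ("5100", "Expenses"),   # a second parent: usually a mistake
]

parents = {}
problems = []
for child, parent in edges:
    if child in parents and parents[child] != parent:
        problems.append(f"{child} has two parents: {parents[child]}, {parent}")
    parents[child] = parent

# Every parent should itself be defined somewhere, except the known roots.
roots = {"Revenue", "COGS", "Expenses"}
for parent in set(parents.values()):
    if parent not in parents and parent not in roots:
        problems.append(f"{parent} is referenced but never defined")

print(problems or "hierarchy is clean")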

    Posted by mbowen at 02:47 PM | TrackBack

    May 14, 2004

    Watch And Learn

I am coming around to the realization that my field of software is an Oldsmobile. Part of this realization has to do with the grid computing infrastructure I mentioned last month. Part of it has to do with the insanity of VB.NET, which I am trying to digest (more on that later). A good fraction of it relies on my absolute ignorance of and fascination with the tools, techniques and outputs of the CG trade (do check out Rockfish). But a lot of it has to do with watching my own kids and monitoring their expectations.

    As Stowe Boyd points out, kids don't care about snail mail. And one of these days, the tools of this trade are going to be a lot more sophisticated.

Yesterday, as I returned Splinter Cell Pandora Tomorrow and Red Dead Revolver to the local Blockbuster, I tried to conceptualize a new set of visualization tools for enterprise computing. When it comes to graphical interfaces, we are years behind in the way we conceptualize the functioning of our businesses. Even the simplest videogame is worlds ahead of what most businesses use. The execs I have talked to who would pay an arm and a leg for stoplight charts and dashboard dials are legion. But no gamer over the age of 7 would settle for the lack of realtime, or the lack of subtlety, in corporate accounting systems. As I staff up at Metro Decisions, I'm only going to take gamers. I decided that a month ago.

    A pal in NZ is starting up a company which appears to have broken through the OLAP barrier. It was bound to happen. Now that reality is at our doorstep, the next phase really has to kick in. And we will not sell it to the current crop of businessmen, but to the next generation which is happening now.

    I have watched the industry for a long time, and they squander resources. They make excuses the same way they make work. Young minds and egos have no time for such ossification.

I argued with F9 yesterday. She says that, as a general principle, the movie is better than the book. The subject was L'Engle's 'A Wrinkle in Time', and I wouldn't concede the point. But perhaps I should. What's in books that can't be digitized? It's always that the book covers things the movie does not. But we do have ways to visually and auditorially master the third-person omniscient, and the craft will advance from here. What's expensive about shooting film in the current format can be overcome by videogame narrative. Kids do spend 100 hours in single games. I know.

    There's a great potential awaiting the right team of programmers. You may not have heard it here first, but this blog is to remind me of certain things as well.

    Posted by mbowen at 10:18 AM | TrackBack

    March 05, 2004

    Deep Variances

    Deep Variances
    M. Bowen
    August 2001
    ---

    Abstract
Deep Variances are high-level indicators of problems at a deeper level of a hierarchy. They can reveal fudge factors and other low-lying problems which might not otherwise be noticed at a high level. The effect is that they quantify problems in a manner suggestive of the *breadth* of a particular problem, in contrast to the *severity* of that same problem. Severity errors propagate normally up a dimensional hierarchy when they are sufficiently large, but a breadth indicator such as a Deep Variance can give an early warning even when certain high-level balances seem OK.

    Example
For example, imagine a high level of variance volatility at a low level of a dimensional hierarchy. There are high negative and high positive variances. These 'balance out' in aggregation, so an individual considering overall progress might not think to look into the detail.
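
A toy illustration of the arithmetic, in Python with made-up numbers, using the same trigger as the script below:

# Invented leaf-level data: (member, actual, plan).
leaves = [
    ("item-A", 100, 110),   # misses plan by -10
    ("item-B", 120, 110),   # beats plan by +10
    ("item-C", 50, 50),
]

total_variance = sum(actual - plan for _, actual, plan in leaves)
deep_variance = sum(1 for _, actual, plan in leaves if actual - plan < -2)

print(total_variance)  # 0  -- the rollup looks fine
print(deep_variance)   # 1  -- but one leaf blew through the trigger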

    Tech Details
In Essbase, make the Deep Variance another member of the Scenario dimension, but don't code the formula into the member. The formula needs to be calculated by script because there are two passes. The first pass is implemented by the custom script below. The normal 'rack and stack' aggregation of a 'calc all' is then sufficient to create the high-level numbers. These high-level numbers will thus represent a *count* of deep variances as defined by the variance trigger. In this case, the trigger is (actual - plan < -2).

FIX (@LEVMBRS(Item, 0))
  /* First pass: mark each level-0 Item whose actual misses plan by more
     than the trigger; the later 'calc all' rolls the 1s up into counts. */
  "Deep Var"
  (IF ((Actual - Plan) < -2)
    1;
  ELSE
    0;
  ENDIF)
ENDFIX

Note that in this case, 'Item' is the only sparse dimension. I don't know what might be the case with multiple sparse dimensions.

    Posted by mbowen at 11:14 AM | TrackBack

    February 15, 2004

    On Data Mining

    Over at Crooked Timber, social scientists and economists take potshots at data mining. Over here in the Business Intelligence field, there's a lot more potential for it.

As an added-value part of a product suite that I am developing, I will be offering some data mining. In the realm of enterprise computing, data mining has a mixed reputation, but that stems primarily from how difficult it is to implement and maintain as a system, given the technical expertise required to properly interpret data and to weed through BS products.

Of the many DW projects I've been on, only a very few customers have expressed interest in data mining, and only one or two have invested. Part of the reason is that, unlike social scientists, the folks I deal with have more real information than they need to make intelligent decisions; furthermore, their business models are not immediately amenable to new discovery.

In the first case, as is well known in retail, when you have millions of transactions at the checkout counter at your disposal, you already have more than enough information to handle most inventory and product profitability problems. Since the retailer's problem is pricing according to supply and demand, mining things like purchase affinity is only icing on the cake. For the most part, all the merchandise in the store is going to remain in the same aisles, and there are only endcaps to change. So figuring out the 28 products to feature there (especially considering the market research already done by suppliers of endcap goods like potato chips and soda) doesn't require extraordinary precision.

Despite the apocryphal tales of diapers and beer sold together by dads making a run, there aren't a great number of data mining success stories in retail. It doesn't make that much of a difference to the bottom line. Speaking of hearsay, a certain large retailer has confided that they have many, many terabytes of data, and it's difficult enough for them just to store it, much less make mining passes over it for interpretation.

In the second case, there is always a proverbial prophet crying in the wilderness about some problem with a company's business model. The example I love to give has to do with what actually happened in one of the biggest and best systems I put together back in the early 90s. At Philip Morris USA, the proud owner of the world's second most powerful brand behind Coca-Cola, there was some slight fear about the market share dominance of Marlboro. As I designed and built their tracking system, which gave monthly market share numbers (when monthly was the most frequent interval at which numbers were published), we programmed some modified confidence intervals with the aid of an economist and *the* statistics text. These told us that it was reasonable to assume that the newly arrived bargain brands would actually eat Marlboro's lunch.
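
The 'modified' part isn't reproduced here, but the textbook baseline is a normal-approximation interval on a sampled share. A sketch in Python, with invented numbers:

import math

# Invented figures: an observed brand share from a tracking sample.
share = 0.42        # observed share
n = 10_000          # sample size
z = 1.96            # 95% confidence

half_width = z * math.sqrt(share * (1 - share) / n)
print(f"{share:.3f} +/- {half_width:.3f}")   # 0.420 +/- 0.010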

At the time, such a thing as discounting Marlboro was practically unthinkable. PM had declared as much publicly, and it was well known that if PM ever reduced the price of its premium cigarettes, it would spell the beginning of the end for the entire industry's legendary profitability. Considering all that, it really didn't matter what our fancy computer projections said the impact of 'Basic' and other generic tobacco products would be.

Some time later, however (we like to think based upon the information we were able to show), PM actually did discount Marlboro. The stock dropped several percentage points and the industry swooned. Then people got over themselves and adjusted to the new normality. Nevertheless, these changes took place in spite of the psychographic data, and the company's sense of that data, which said brand loyalty would survive price competition.

I primarily think of data mining in the context of multidimensional analysis. 'Bucket shaping' is how I will use it in my next application. Predicting which factors people use as customers is a dicey business, and it's reasonable to pay a marginal amount to gain a marginal edge. Honing that edge and finding the real cost benefit is no simple matter, and it certainly isn't directed toward independently verifiable theory the way social science is, but marketing is non-trivial work, and marketing managers do buy it.

    Posted by mbowen at 09:15 AM | TrackBack

    February 05, 2004

    MS Reporting Services - First Look

They didn't make it pretty. They made it work. From what I could get from the demo this morning of Microsoft Reporting Services, Microsoft has a very big hit on its hands. Sell your shares of Crystal right now.

There are a lot of interesting things to know about MSRS. The most important is that they did a very good job of putting some hot functionality into it, so that it's not a joke product. Coming out of the gate, MSRS is capable of handling a good 75% of enterprise reporting requirements. They didn't show much on the security side, so I might hedge on that a bit; other than that, it appears to be a very competent product.

Licensing is 'free'. Basically it rides on top of your SQL Server license, and I'm not sure how they enforce that in code, but there are basically no barriers to entry with regard to getting started. This means a lot of companies are going to start hedging their bets on purchasing products from the competition until their SQL jockeys get their hands on it. And guess what: SQL jockeys are just the intended audience.

Although this has a translation doohickey for Access Reports, there are gobs of lovely SQL behind this piece of work. That means SQL jockeys and hacks will intuitively understand it and crank out many, many briefing books and satisfy a lot of needs in short order. That language called SQR just became a useless skill.

It's difficult for me to understand the MS upgrade path with regard to their licensing, so I'm not in a position to determine whether it has a reasonable chance of upstaging current implementations of BO, Cognos, Crystal, and Brio. After all, most enterprise reporting projects don't originate from SQL Server. So somebody's going to have to pay to transfer Oracle seat money to grow up the piddly SQL Servers all over the place that MS is betting will come into use. But since the learning curve seems deceptively shallow for MSRS, a lot of apps developers may very well jump ship.

The wizards look fairly nice. After all, page layout is generally a no-brainer; it's made just for wizards. There are very handy table, list and text objects to drag and drop around the design tool, which allows you to preview your reports in realtime. Data is provided through XML/SOAP from .NET, so as a data source you navigate to an http URL on a local or remote server. It paints everything rather quickly, and it isn't very difficult to see how you flip back and forth between the design screen and a QBE thingy very much like the one in Access to select your data. So I gather that this will allow a reasonable individual to develop reports rather quickly and efficiently. (You can use stored procedures too.)

The reporting stack has four services that sit on top of a SQL Server-only repository: Rendering, Security, Data Processing, and Delivery. By sitting on top of SQL Server, it's going to have a very strong management layer built in. From what I can tell, you can manage dev, QA and production servers from one console and do migrations back and forth automatically.

    More Later...

    Posted by mbowen at 04:38 PM | Comments (1) | TrackBack

    January 26, 2004

    Sentimental Favorites

    Norm Geras has posted the results of his top movie surveys. This is the part where I criticize the rest of the world for not seeing things as I do. But it's probably a better bet for me to use it to get some DVDs for the collection.

What's more interesting to me is investigating his tabulation methodology. What tools did he use? How often is he willing to do this kind of stuff, which is a non-trivial exercise given that he had to do a lot, if not all, of the data entry by hand?

    It has long been of interest to me to put some of the technology I work with on the net for free so people can do this kind of analysis like the pros. Finally in the blogosphere I have found a set of people wonky enough to care. Thanks for the inspiration, Norm. Now can you mail me a copy of that spreadsheet?

    Posted by mbowen at 02:34 PM | TrackBack

    December 16, 2003

    Sour CRM Grapes

I just discovered in an old Information Week magazine, dated October of this year, that Siebel made $157 million in revenue from its [sucky] CRM analytics software. One month before my division at Hyperion was destroyed by [clueless] management, we were closing the deal to embed our software into that solution.

    Posted by mbowen at 03:16 PM | TrackBack

    December 02, 2003

    Eyeballing Steers

    Floyd gives us something to think about with regard to programmer productivity in the enterprise computing space. I tend to be on his side.

Way back in high school, I was told by my geometry teacher that the Four Color Problem had been solved. He was at once dismayed and astounded that it had been done with a computer and that it wasn't an 'elegant' solution. But he, like most of the math community, rather grudgingly accepted that some problems just may not have elegant solutions. Such problems, while not beyond human curiosity, require diligence beyond human capacity. Thus was the domain of computers.

I happen to believe that the problem of software programmer productivity isn't really a problem in the traditional sense of something that can have an elegant solution; like the Four Color Problem, it cannot be solved by one single and elegant line of reasoning. Peter Drucker famously said that business management has only just begun to be studied. This rather stopped the endless questions about 'which method is best', but not the effusive praise.
Business computing is what I call a 'significantly bitty problem'. It can only be solved empirically, by going down each twisty alleyway. Managerial computing (Keen, Zuboff) has only just begun to be studied.

The problem with business, just as the problem with business computing, is deceptively simple: generate profits. Of course there are virtually infinite ways to do so. Tools that assist businesses in their aims are rightly daunting, not simply because the underlying technology shifts from time to time, but because the goals and methods of business are constantly changing.

The purpose of managerial computing is to serve the purposes of management, which are actually a bit more complex than the purposes of investment brokers. Whereas the data for investors vis-a-vis Bloomberg-type systems and models is hard, no such depth of analysis has been done in every sector as regards the soft intangibles which could be effective measures of business performance. It's easy to read a momentum chart for a stock which has daily prices and make predictions. It's not so easy to eyeball a herd of yearling steers and guess the price of beef. Eyeballing steers is where much of the early science of business management is: still analog. So there is a great deal of evolution ahead before the world of enterprise managerial computing has identified the methods to assist in the majority of enterprises.

Simply because Siebel has made billions selling CRM software to all the telemarketing companies you hate doesn't mean the state of enterprise software has significantly advanced. One of the lessons we should have learned from the dot-com implosion is that there are indeed thousands of business plans and models that do lend themselves easily to digitization, but only a few of those are actually profitable. Yet there are millions and millions of others that still exist, untouched by that revolution and its software tools and methodologies.

Let us also not forget how long it took GM to understand and then implement supply chain strategies. Decades. Such management philosophies, still not wholly explored, certainly not ubiquitously applicable across industries, will in time generate reliable and generally accepted performance metrics. Whatever state the software tools industry is in with regard to its management of programmer resource allocation, language choices and distribution of compute power, there will always be an inelegant alley to travel: the space between what business conditions provide, what managers see, and what systems can be built efficiently to aid in supporting that vision.

    This is not a technology problem.

    Posted by mbowen at 09:16 AM | Comments (1) | TrackBack

    December 01, 2003

    A Warning to Hyperion

I'm converting... taking the long hard slog up the curve of C# and MDX. I'm taking the Microsoft path more traveled, and going downscale to midmarket.

    More opportunities. Decent tech. Harder work. More's possible.

In two years, me and people like me are going to be very dangerous to Hyperion's core business. Why? Because we'll have been operating at a level deeper than Hyperion is willing to open up their products. Our baling wire and duct tape systems will be functionally equivalent to Hyperion's shrink-wrapped stuff.

    It'll be scary.

    Check out these two:
http://www.mdcbowen.org/cobb/archives/001147.html

http://www.101-280.com/archives/000185.html

Large companies are outsourcing Essbase-level coding to third parties and overseas. Architects and head DW designers are the only ones safe today, and the tools around MSAS will evolve to the point at which one or two folks can do a whole custom project.

    The question is whether or not the enterprise software sales business, with the high overhead of direct sales, can survive. Companies like HYSL will have to make huge deals with big companies and sell ubiquity on the inside. I don't think Hyperion will be big enough to do that in two years.

As I implied, cost of ownership isn't going to be a compelling argument to companies that outsource. So if Essbase supports MDX, it had better lower the price. And if HFM and Planning don't support .NET, they had better open up a very attractive API.

Essbase consultants are starving out there. The world hasn't learned the development methodology. The labor market of Essbase developers is hedging its bets. Almost everybody I know who was an Essbase head 3 years ago has added a second technology to their toolbelt.

Six months ago, if you entered 'Essbase' into dice.com, you'd get 300 jobs. Today it's 102.

Integration is a huge issue; that's why there is lots to fear from Yukon.

    (also Floyd)

    Posted by mbowen at 08:00 AM | TrackBack

    November 25, 2003

    A Light Rant on MDX & MSAS

MSAS seems to have been designed with no application in particular in mind; rather, it's a platform meant to be all things to all people. A little bit of everything, with an emphasis on ubiquity. I haven't had the opportunity to see it evolve, but I see what it has become and what it is.

The great difficulty in the OLAP wars is a dearth of developers. There are very few people who have learned and mastered its subtleties. What strikes me as particularly true is how mindful the Microsoft designers have been of this fact. They have designed a multidimensional query language with all the earmarks of being strictly utilitarian, which is to say without the grace of having a clear purpose in mind. Therefore it is graceless and clunky. With enough force, however, it appears that it can be made to do any number of things, and those things it cannot accomplish it leaves to the infinite scope of the Microsoft world.

So it is the Visual Basic or ASP or .NET or Transact-SQL programmer who is most encouraged by this fact, not the current expert business intelligence professional. Though we have struggled with arcane languages and odd APIs our entire careers, it is with some disdain that we approach the tortured syntax of MDX.

The learning curve of MDX is legendarily steep. Apparently it takes at least a year to master. This means that, inevitably, one's first project is doomed to simplicity if not a worse fate. Even an old pro such as myself is daunted by MDX, mostly because it is overloaded.

In some ways it can be said that MDX is a developer's dream. It is not only a query language, but a procedural and control language. With certain extensions it can control not only the content but the format of the content. It's probably larger than it needs to be, and since it controls so much, the tools that employ it must be large too.

The stalwarts at ProClarity have excelled at and mastered nothing so much as the parsing of this beastly tongue, and such parsers struggle mightily with those of us who are sloppy with syntax. Their VP of R&D hosts a puzzle page. MDX simply invites verbose solutions to simple problems. It's codey. Therefore there is much to parse and much to learn, with probably some pitfalls. We'll see.

    I’ve got Spofford’s book and am getting into these matters more closely. Let’s see how right I am a year from today.

    Posted by mbowen at 08:16 PM | Comments (1) | TrackBack

    November 20, 2003

    The Beauty of MSAS

Today's class got us into the arcane art of dimension building with MSAS. Imagine my surprise when I found that they have a dimension type for Bill of Materials. In all of the seven years I have been building databases with Essbase, that was the only aggregation problem I couldn't solve. With Analysis Services, it's handled very sweetly as an attribute of parent-child builds. Not only that, you can eyeball aggregations right in the dev studio just to make sure.
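
For those who haven't hit the problem, what makes a BOM rollup special is that quantities multiply down the tree rather than simply adding up it. A toy sketch in Python, with an invented parts list:

from collections import Counter

# Invented parent -> [(child, quantity per parent)] bill of materials.
bom = {
    "bicycle": [("wheel", 2), ("frame", 1)],
    "wheel": [("spoke", 32), ("rim", 1)],
}

def explode(part, qty=1):
    """Yield (leaf part, extended quantity) pairs for one unit of `part`."""
    children = bom.get(part)
    if not children:
        yield (part, qty)
        return
    for child, child_qty in children:
        yield from explode(child, qty * child_qty)

totals = Counter()
for leaf, qty in explode("bicycle"):
    totals[leaf] += qty
print(dict(totals))  # {'spoke': 64, 'rim': 2, 'frame': 1}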

I am also finding it rather nice to see that there aren't many features and terms which aren't easily translatable between the two worlds. You really have to know something about the internals of each package to realize whether or not you are getting an equivalent feature. So I am seeing why so many of the arguments for or against each fall deep into the 'that depends' category.

    What I have yet to see is how MDX plays into matters of customization, and of course I have yet to deal with any complex calculations. That comes tomorrow. So far, so good.

It is worth noting that one doesn't seem to have quite as much control over what is stored vs. calculated on the fly without the extensive use of partitions in MSAS. What effect partitions have on accuracy (I've heard there is some) and complexity has yet to be determined. But it is nice that one may, ahead of time, estimate the percentage of the database that is aggregated. What I've seen in laptop-size databases is no indication, and most everyone concedes that Essbase is a faster and more powerful aggregator, but it is hard to argue with the possibilities of remote partitions accessed through OLE DB.

    Posted by mbowen at 03:56 PM | TrackBack

    The Agony of DTS

I've always liked DTS. That's because I've only ever had to use it for fairly simple tasks. If I needed triggers and multiple conditional operations and whatnot, I hand-coded them in ksh or Perl. I've also had the protection of the Rules Editor for Essbase, which allows me to visually program column joins, appends, substitutions, splits and order rearrangements. Those days are gone. I'm going to have to make DTS do some heavy lifting.
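
For flavor, the kind of transform the Rules Editor did visually is trivial to hand-code; here's the rough shape of it in Python (file names and layout invented), which is roughly what my ksh and Perl have been doing all along:

import csv

# Invented layout: pipe-delimited extract with entity|account-desc|amount.
with open("extract.txt") as src, open("load.txt", "w", newline="") as dst:
    writer = csv.writer(dst, delimiter="\t")
    for row in csv.reader(src, delimiter="|"):
        entity, account_desc, amount = row
        account, desc = account_desc.split("-", 1)        # split one column in two
        entity = entity.replace("CORP", "Corporate")      # substitution
        writer.writerow([account, entity, desc, amount])  # reorder columns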

In order to manage the metadata for dimension builds and data loads into Analysis Services cubes, DTS is going to be the master of all tasks. I started some of that work yesterday in class. Can you say tedium? At first, I was just giggling at the kind of billing I'm going to be able to do because such things will take so much time. And yet I heard others with more experience griping about doing certain things with SQL stored procedures. Is it just me, or does passing parameters to a query language and then debugging it with breakpoints sound like a perversion? I suppose if it's going to be a language one might as well, but I've been doing query stuff through APIs for the past seven years, and doing my ETL through hand coding. A double reversal.

    The good news is that DTS itself is supposed to be transformed with the arrival of Yukon. They are actually going after Informatica, Sagent and Ascential. That's all good as far as I'm concerned. It can only get better.

    Meanwhile, I'm trying to figure out how to encapsulate Perl into its processes, you know what I mean?

    Posted by mbowen at 07:37 AM | TrackBack

    The Road to Redmond

I'm embarking on a change in direction, away from perfection and towards the economics of light manufacturing and mass production. Instead of crafting the biggest, best, most efficient, cutting-edge and brilliant products, I'm going to build things with Microsoft tools that work.

This means I have to put .NET, C# and SQL Server at the center of my universe and start learning how to use them. In some ways this is like a midlife crisis, as well as a crisis of confidence in the thought processes of corporate IT. Whatever. The decision is made, and I'll chronicle the journey here and over at Cubegeek.com, my technical site.

    Even though the enterprises I'll be working with will be smaller, I'll still call it enterprise computing. Furthermore I'll be a bit more enterprising. Let's see if I get squashed in the process.

    Posted by mbowen at 07:28 AM | TrackBack

    September 15, 2003

    The Big Move

I'm not sick of Essbase, but I'm no longer going to afford myself the luxury of ignoring Analysis Services. So I'm going to be doing some conversions and keeping a journal of it along the way.

The first step is getting all my ducks in a row. I've got three machines. First is my laptop running Win2K Professional; I just have the standard Essbase Excel client there. Next is Mars, my big NT box. It's got about 512MB of RAM and a fairly big disk. On it I have DB2 Personal Edition, an old Oracle footprint and Essbase 6.5, as well as a newly installed SQL Server 2000 with AS running. The third box is an old Dell running RH 9. I'm primarily going to use that to generate fake data if I need to; I've got a bunch of cool scripts for that purpose.

    On the laptop, I also have DB Designer 4, a Java equivalent of Erwin. Maybe I'll use it, probably not. We'll see.

First I'm going to cruise through an MDX tutorial; I found a good one here. I'm also going to get an eval copy of Temtec for my front-end. I may get a copy of ProClarity too.

    Posted by mbowen at 02:50 PM | TrackBack

    June 29, 2003

    my.super.com

The project I've been working on is rather amazing in several ways, but the one I want to comment on is the demand it has put on our database servers for some very processor-intensive work.

Once upon a time, a company named Britton Lee did something with the client/server model that has almost been forgotten, which is to make server hardware task-oriented. Most computers we love and use are general-purpose machines. And although the chipsets in personal computers have evolved to make their motherboards very attuned to human needs, most chip architectures in the machines we use are not designed with particular software tasks in mind.

This is odd considering how much of the business done by large databases is predictable. I am confident that the architects of DB2 and Oracle have optimized their offerings to work in highly efficient ways on every computing platform for which they are offered. But this still doesn't change the fact that, more often than not, our appetite for speed in processing the queries that matter to us exceeds the hardware available for the task. Wouldn't it be nice if we could just push a 'turbo' button and reroute our task to the really big iron instead of the measly $500,000 8-way box we are using?

The alternative to having a real hardware database server is to have a larger general-purpose machine available in a shared pool to handle very large tasks on demand. This is the attractiveness of the promise of grid computing. In specialized areas such as data warehousing, the market possibilities are great.
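
The dispatch decision behind that 'turbo button' is simple to state; a sketch in Python, with an invented cost model and threshold:

# Invented capacity figure: rows the local box can scan in acceptable time.
LOCAL_CAPACITY = 1_000_000

def route(query_name, estimated_rows):
    """Send oversized work to the shared grid pool, keep the rest local."""
    if estimated_rows > LOCAL_CAPACITY:
        return f"{query_name}: burst to grid pool"
    return f"{query_name}: run locally"

print(route("monthly close rollup", 250_000))
print(route("five-year restatement", 80_000_000))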

Recall that only a decade ago, most of the corporate appetite for marketing or sophisticated business intelligence data was outsourced to service bureaus. Often a consortium or industry segment would provide data collected by individual organizations and submit it to a group like MSA. MSA would then process that collected data for a fee and resell it in a digested or expanded form back to the industry.

But since the rise of client/server and UNIX in the corporate sphere, IT departments have been taking back that market, upgrading and expanding their own compute resources and building their own systems. Speaking from my own experience, most of the work in this space has been handled by highly paid consultants and consulting organizations. Implementing these systems has been difficult, but the appetite remains great. Enabling companies to close their financial books and distribute results globally in 5 days instead of 45 is a common goal in large organizations. This puts enormous stress on systems at peak periods.

    Handling this cyclical demand by leasing space on mammoth systems can prove to be very efficient for purchasers of IT infrastructure and grid systems providers as well. This is where we should expect to see the promise of grid computing delivered.

    Posted by mbowen at 08:51 PM | TrackBack