Quovis Consulting

Funding Development and Driving Demand


Speedier Data Analytics is Not a Panacea

June 19, 2014 By Alex Grgorinic

Flash memory is rapidly moving into the data center market, as recently reported in BusinessWeek. Now that it is possible to collect more data, more analyses can be performed. In a continuously changing world, the data is continuously changing as well. And we can’t keep looking at the same data forever, because it is sure to become stale and lose its value. So companies are anxious to process that data faster, and we are seeing quite a rapid migration of flash memory into data centers in order to serve up that data faster.

The Hadoop platform has brought distributed computing power for faster number crunching, with all of the parallel cores being put to work. And CDNs have enabled providers to minimize the number of network hops that data streams have to travel through in order to serve up web pages with minimal latency. But now bottlenecks may be arising simply in pulling large data sets off those spinning disks. So there is a move to flash memory, despite the premium price.

This really underscores how markets are working today. We need data to support operational decisions. But that data is changing. So we need to make our calculations on what happened and measure what changes we are seeing. It is always about trying to understand, “Why is this happening?” and “How should we respond?” If the context of what is happening can be isolated, our understanding will improve. So whether you need to make decisions about inventory management, production orders, sales support staff, product mix, sales promotions, or geographic hot spots, or correlate results to outside events, you need to analyze the data to understand.

But, are you a scientist or engineer? This is about the mindset you bring to how you go about identifying, gathering and acting on data. Following the science path, you may indeed end up with an insatiable desire for data. And you clearly don’t want to end up in a state of paralysis-by-analysis. You really need to follow an engineering approach, building a process that is good enough and can give you actionable information. Approximations and assumptions are what make this possible.

One thing that history has taught us is that it is all about adapting to change. Those who adapt successfully will be those who are in a position to prosper. The absolute key here is that you acknowledge the need to analyze the results and changes that are happening in your business. It is then vital to determine the measures that matter to your business, and act on them.

But, no data set is perfect. And, no model is perfect. You can continue to arm yourself with more data and more data analytics tools. But you need to assess to what extent you are improving. Just as the high frequency trading industry is coming to understand, front running will only get you so far. At some point, you have to revisit how you are creating value in your business, and whether you are still doing the right things. Just digesting data faster is not a panacea.

Filed Under: Business Model

“Lolly Wolly Doodle”

June 12, 2014 By Alex Grgorinic

As playful as this phrase sounds, it is the name of a company. And it is anything but fun and games when you listen to Lolly Wolly Doodle’s story. Lolly Wolly Doodle is an e-commerce business that offers customizable children’s clothing on-demand. But the real story is not what they do, but how they do it. It makes for a great case study on both business model development and demand generation model creation. Inc. Magazine recently provided a full history and profile on the company and its founder, where you can enjoy the serendipity of how it all came together. It will no doubt leave you cheering.

Figuring out your business model and figuring out your demand generation model have a key commonality. In both cases, you don’t know what is going to work until you actually do it. In the case of the Lolly Wolly Doodle’s demand generation model, it uses Facebook to gauge the real demand for new styles. And then scales up its efforts around the popular designs.

The great irony of the situation is that the company had only sold its wares on eBay while working out of the home garage. But faced with a batch of dresses that were subpar in quality, and not wanting negative feedback on eBay, the founder needed to offer the dresses at clearance pricing through a different channel. So they were offered up to a small lot of Facebook fans (153 of them in total) who had signed up at local Junior League events. Amazingly, the dresses sold out within minutes. The channel was then tested again with various designs that could be made to order, and the pace of sales continued. This was not a fluke. Within a couple of months, eBay was no longer a channel.

From a demand generation perspective, the eBay channel was doing just fine. Reputation was deemed the key metric, and the entrepreneur only turned to an alternate channel so as not to damage this reputation. But that alternate channel, Facebook, even with a small number of fans, was amplified by the sharing effect that occurred. And this is what moved Lolly Wolly Doodle to the next level. It had inadvertently found a way to match its products to the ideal context, the Facebook News Feed. This became like Tupperware parties on steroids. I can’t get over the irony of how it occurred. But the fact of the matter is that finding the right channel, where the context matches your offering and people will give up their attention span, is the real nirvana of a demand generation model.

From a business model perspective, the company had effectively hit the fashion industry where it was weakest. It found a way to offer customizable children’s clothing on-demand, whereas the fashion business is all tied to high volumes, with lags in the ability to adjust to changing preferences. Again, the company’s on-demand model, supply-chain model and manufacturing model all evolved from the roots of business. That is, don’t waste material, but find a way to repurpose it. And organize production by type of process (i.e. similar cuts, or similar sewing), rather than by specific garment batches. It all evolved from the genuine need to do things efficiently and economically, with fast turn-around, while not being stuck with conventional processes.

This story can serve to inspire us all. You need to experiment with your demand generation techniques in order to know how it really is going to turn out. And you need to look for ways to do business that may not have been considered. If your offering has a solid value proposition, getting that message out in the right context can be truly transformational.

Filed Under: Business Model, Demand Generation

The Hands on the Clock

June 10, 2014 By Alex Grgorinic

When you look at the hands on the clock, they look like they are not moving. And so it may be in the day-to-day view of your business and the markets that you serve. If you keep staring at the same thing, change is not apparent. The status quo seems like a good strategy. But of course, things are always changing. And garnering new insights depends more on how you look at things.

One of the keys to garnering those insights is to ensure that you soak up information from multiple channels. Your customers live across multiple channels or touch points and so should you. What you will pick up in each of them will always vary. And it is only by aggregating what you learn from each of them that you will be able to connect the dots of how your market is behaving.

Nate Silver predicted the results of the 2012 presidential election in all 50 states and the District of Columbia. Quite a feat. He accomplished this by aggregating hundreds of polls, at both the state level and the national level. There was no point in trying to pick a subset of polls as the source. Rather, he aggregated them and established a baseline based on the historical performance of the individual polls. With a starting point for the weighting assigned to each individual poll, he applied a model of conditions which would adjust the weightings. The results of the prediction cannot be argued with. He got them all right, at a time when a whole lot of professionals were claiming the presidential race was too close to call.

There is a lot of insight here in how to read your own market. Identifying all the channels through which information about your market and customers can be gleaned is important. Having an aggregate model which you adjust will keep you headed in the right direction, more so than any single source. Voters and customers are a lot alike. With customer research data and customer surveys, the same type of decision making occurs. Customers, the same as voters, think that they will do one thing, and then can change their minds. So it is only by using the equivalent of multiple polls that you can get a better picture of what is happening.

Within the business context, multiple polls translate into collecting input from multiple channels: multiple types of customers, multiple suppliers and multiple advisors too. You will benefit by aggregating those multiple perspectives. I myself have looked at multiple research reports which vary in their conclusions, sometimes quite diametrically opposed, even in surveying the same customer base. The key is not to try to pick the winner. Rather, the larger insight will be gleaned if you can derive a superset model where all results hold their spot.
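The aggregate-and-weight approach described above can be sketched in a few lines. To be clear, this is a minimal illustration with made-up figures and a simple error-based weighting rule; it is not Nate Silver’s actual model, which was far more sophisticated:

```python
# A minimal sketch of weighted aggregation across multiple sources.
# All figures and the weighting rule are hypothetical illustrations.

def aggregate(estimates, weights):
    """Weighted average of point estimates from multiple sources."""
    total_weight = sum(weights)
    return sum(e * w for e, w in zip(estimates, weights)) / total_weight

def adjust_weight(base_weight, historical_error):
    """Down-weight a source in proportion to its past error (illustrative rule)."""
    return base_weight / (1.0 + historical_error)

# Three hypothetical "polls" of expected demand, each with a track record.
estimates = [52.0, 48.5, 50.5]        # e.g. forecast market share (%)
base_weights = [1.0, 1.0, 1.0]
historical_errors = [0.5, 2.0, 1.0]   # average past miss, in points

weights = [adjust_weight(w, e) for w, e in zip(base_weights, historical_errors)]
print(round(aggregate(estimates, weights), 2))
```

The point of the sketch is simply that no single source is trusted outright; each contributes, discounted by how well it has performed before.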

Even if the status quo is serving you well, it will be important for you to understand what is impacting your status quo. The internet tools today offer up so much in enabling you to build your own model. From A/B testing to Google Analytics, techniques and tools abound. Be sure that you are tapping the information resources that will enable you to see how your market is moving. Don’t be fooled, the hands on the clock are always moving.
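When it comes to A/B testing, the core question is always whether an observed difference between two variants is real or just noise. A common way to check is a two-proportion z-test; the conversion counts below are hypothetical:

```python
# A minimal sketch of reading an A/B test result with a two-proportion z-test.
# The conversion counts below are hypothetical.
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: 120 conversions from 2,400 visitors; variant B: 150 from 2,400.
z = z_score(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
# A |z| above roughly 1.96 indicates significance at the 5% level.
print(round(z, 2))
```

Tools like Google Analytics run this kind of calculation for you, but knowing what is underneath helps you judge when a “winner” is actually too close to call.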

Filed Under: Management Consulting

What is “Technological Advancement”?

June 5, 2014 By Alex Grgorinic

Hint: don’t look in the dictionary. As developers in Canada know, it is quite a nebulous thing. And in fact, the question seems to come up with increasing frequency as one goes through an SR&ED audit. The more explanation that takes place, the greater the struggle to understand it. The process of trying to understand it deteriorates into a process of explaining what one thinks it may be, followed by the question “Is this it?”

Well, why don’t we just scrap the whole phrase “technological advancement”? Advances in technology today are all about how we combine existing technologies in new ways. There are many real advancements occurring in this way. As Geoff Colvin describes in his recent Fortune column titled “Welcome to the era of Lego innovation”, breakthrough products are being created without creating new underlying technology.

Technology builds technology. But we are at a stage where there are so many “Lego” pieces that the opportunity and challenge clearly lie in exploring how the different pieces can be fit together. We don’t necessarily need new Lego pieces to achieve advancements. We have so many of them. Hardware. Software. Sensors. It really is about how they can be combined and harnessed to better solve problems, or introduce benefits that just could not previously be economically created.

Advancements largely do not occur in quantum steps. Rather, they are mostly incremental. This shouldn’t be surprising to anyone who has searched through prior art. That exercise sheds light on the true nature of advancement; i.e. many incremental advances building on previous knowledge.

So what does it all mean? Clearly, we are in an age of progress where advancements are tied to the ability to be both imaginative and creative in combining existing technologies. And certainly there will be some uncertainty that innovators must work through. All of this will bring new economic growth and further advancement of both skills and knowledge.

Now the uninitiated may believe that this all sounds like SR&ED. Unfortunately, that is not the case. The SR&ED process in Canada continues to get more difficult for establishing successful claims, for many projects that are making real advances and delivering all those economic benefits that we cannot live without. The SR&ED administrators are gravitating to a purist stance, where their orientation seems much better aligned with the university experiment than with real experimental development. As I see some great advancements belittled and claims drastically slashed, I can’t help but worry about the effect. Entrepreneurs who have made huge personal sacrifices feel that they are being crushed by the SR&ED program, rather than supported.

It really does signal that we are at a point in time where the SR&ED criteria should be changed to capture the true gains that we are trying to achieve. But don’t hold your breath. The last major review, in 2011, has not resulted in any improvements.

So where does that leave all the great potential claims? For starters, you cannot afford to have an over-zealous conviction that your work will qualify for SR&ED. It may not. You must learn about the process that the SR&ED auditor will follow in reviewing your claim, and how you need to present it. You need to really understand what you are getting into, before you invest considerable time and effort in all the prep work. Just as there are uncertainties in the development cycle, there are many interpretation uncertainties in administering the SR&ED criteria. If there were not, the SR&ED auditor would not have to conclude the audit with the phrase “In my opinion”.

Filed Under: SR&ED

The Answer is Not in the Back of the Book

June 2, 2014 By Alex Grgorinic

Strategic decisions these days are all about taking calculated risks based on good data. And that is the tough part. The good data. Indeed, we see firms like Gartner and Forrester growing steadily, and quoted widely by firms who want to strengthen the credibility of their message. In general this is a good thing. There is no question in my mind that both Gartner and Forrester produce high quality and professional research. And there is a lot of value in the focus and frameworks that they provide. But, it is still quite important to judge how the data is put together, before accepting the forecasts and conclusions.

From my own experience, I once led the product marketing thrust for an optical performance monitor for the telecom industry. Over the course of 2 years, I had introduced the module to every major switching equipment manufacturer in the telecom industry. But, as you may recall, the telecom industry went into quite a collapse in the period 2001-2002. At that point in time, there was an incredible amount of forecast scrutiny. Since I was looking after a product that had a bright future, the scrutiny was even greater. And with this scrutiny came the data that had been provided by a market research firm that specialized in the telecom industry. The executive team presented the industry forecast data and was wondering about the disparity with my own figures. When I looked at the data, I found it very interesting. The market segmentation was completely consistent with my own assessment. But the forecasted quantities differed somewhat significantly. I came to the conclusion that we had clearly talked to the same customers but filled in the blanks differently to arrive at our forecasts. Unfortunately, I turned out to be correct, and growth was slowing down at a rapid pace.

You would need a fortune teller to predict the future. Since the answer to your questions is not in the back of the book, the fortune teller needs to have excellent skills in interpolation and extrapolation. Peter Drucker is often quoted as saying that no one can predict the future. The best that you can do is identify patterns of things that have already occurred. But where things are going, in terms of both direction and speed, is a tough one to get right. You need to get close enough to the field of study; you have to hone your skills of perception and intuition; and you still need to have a manner in which you can stay objective. And that is the real tough part. Picking the right dots and connecting them the right way.

To grow as a company, you need to make strategic bets of the future. To do this without losing the farm means you need to establish a systematic process by which you can gather and assess different sources of information. Customers, suppliers and competitors are all contributors to your knowledge base. And so are research firms and consultants who can bring an outside perspective to your process. Blending it all will identify strategic choices, while reducing risk. But remember, it will never be a clear cut answer and it is not in the back of the book.

Filed Under: Management Consulting

