Are you looking at getting SAP Hana for your firm? Have you seen the price of Hana these days? $140,000 per node (one node = 64GB)?
So you really want to get some data and grind away using SLT and ODBC/JDBC across multiple systems, but then the hardware costs creep in, along with the additional nodes needed to make it a compelling solution.
Well, do we have a plan for you. Thanks to the Approyo reseller relationship with SAP AG, we will BUY your licenses for you. Yes, you heard us correctly: we will acquire the needed licenses and load them into our state-of-the-art "Private Cloud" for your team's corporate usage. You then pay for your total solution as a monthly OPEX rather than a CAPEX, and you get the POWER of SAP Hana.
How can we do this? What are the real charges at the end of the day? Oh, and what is my access? you say....
All great questions, so let's lay them out one by one.
How can we do this? We are an SAP reseller partner working specifically with SAP and SAP AG to provide a cloud solution to our customers. Thanks to SAP, we can offer an all-encompassing solution, lock, stock and barrel.
What are the real charges at the end of the day? Take the licenses, add the cost of the infrastructure, support, storage, networking, etc., and you have a total cost. Divide that by the number of months in your term agreement and you have a low-cost monthly option for Enterprise Hana, BW on Hana, or maybe you want Business Suite on Hana. Well, here it is.
What is my access? 24x7x365, from anywhere you or your team happen to be.
So, at the end of the day, if you want to get into Hana and try it out, we can get you up and running in a private cloud, in our PoC, for less than you think. And the great part is that when you love it, you don't lose any of the PoC build: we migrate it right on over to your new production solution.
How's that for never losing sight of your needs?
Let's look at an actual scenario for you:
256GB SAP Hana license (1.5TB uncompressed data)
256GB Prd. Hana Cloud
256GB Dev Hana Cloud
All Support, All Back-up's
All Networking, SLT connections etc.
24x7 support
72-month term for less than you can imagine!
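The monthly figure behind a term like this is simple arithmetic, as described above: total up the licenses, infrastructure, storage and networking, then divide by the term length. A minimal sketch in Python — every dollar figure here is a hypothetical placeholder except the $140,000-per-64GB-node list price quoted earlier, and none of it reflects actual Approyo pricing:

```python
def monthly_opex(license_cost, infra_cost, storage_cost, network_cost,
                 term_months):
    """Flat monthly charge for a term agreement: total cost / months."""
    total = license_cost + infra_cost + storage_cost + network_cost
    return total / term_months

# Example: a 256GB license at the $140,000-per-64GB-node rate quoted
# above (4 nodes), plus made-up infrastructure figures, over 72 months.
license_cost = 4 * 140_000  # 256GB = 4 x 64GB nodes
monthly = monthly_opex(license_cost, 120_000, 30_000, 18_000, 72)
print(f"${monthly:,.2f} per month")
```

The point of the OPEX model is exactly this flattening: one known number per month instead of a large upfront CAPEX.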
All you do is log onto the web page and away you go...
Let's chat when you're free to discuss getting SAP Hana up and running for you!
To reach Approyo call Marcus at 404-448-1166 or marcus@approyo.com
Friday, June 28, 2013
Thursday, June 6, 2013
NFL Player Analysis with Lumira Suite by: Drew LeBlanc
As the temperature rises and summer goes into full swing, we know this signals the start of a number of things: cookouts, pool and beach time, summer camps for the kids, and of course football season. While that last one may seem like the odd man out, true football fans know it is never too early to be talking about next season. As a New England Patriots fan, one of the biggest stories of the offseason was the replacement of our longtime wide receiver Wes Welker with a newer player, Danny Amendola. For those who aren't familiar with the names or the situation, it was basically a case of an older, aging player (Wes Welker) who wanted a lot of money, and a team deciding they would rather spend less on a younger but less proven player (Danny Amendola).
The debate of which way the team should have gone was a rather polarizing one, leaving diehard fans on both sides of the fence. But while most people were talking about the stats and the player’s on field performance, they were neglecting maybe the most important piece: the money. To help settle this debate I decided to turn to the numbers and compare both the performance numbers as well as the less talked about salary figures, and what better way to do it than with the brand new SAP Lumira Suite!
The first step I took was collecting the data. While I already had player performance data like yards and touchdowns, I needed to combine this with each player's salary so that I could see the correlation. Once I collected the salary data (from spotrac.com), I used SAP Lumira Desktop to merge it with the player performance data that was already loaded. As you can see in the picture, I get a 12% match on the player ID key; the rate would be higher, but I only pulled salary data for the select few players I wanted to compare.
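The merge Lumira performs here can be sketched in a few lines of pandas. The column names, player IDs, yardage and salary figures below are hypothetical stand-ins for illustration, not the actual dataset:

```python
import pandas as pd

# Performance data already loaded (as in Lumira Desktop); values are
# illustrative, not real season stats.
performance = pd.DataFrame({
    "player_id":  [11, 83, 87, 80],
    "player":     ["Julian Edelman", "Wes Welker",
                   "Rob Gronkowski", "Danny Amendola"],
    "yards":      [235, 1354, 790, 666],
    "touchdowns": [3, 6, 11, 3],
})

# Salary data pulled separately (e.g. from spotrac.com); only a few
# players of interest, so most performance rows will not match.
salaries = pd.DataFrame({
    "player_id":   [83, 80],
    "salary_2012": [9_515_000, 1_912_500],
})

# Left merge on the player ID key: players without a salary row are
# kept with a missing salary, mirroring the partial match described.
merged = performance.merge(salaries, on="player_id", how="left")
print(merged)
```

A left merge is the right choice here precisely because the salary pull was deliberately partial: it keeps every performance row and simply leaves the salary blank where no match exists.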
With my data merged, I was able to compare the players' on-field numbers with how much they were being paid. In the first chart below I left the salary numbers out, using only total touchdowns scored and yards gained as the two values (the two main factors players are measured on). I filtered on all of the Patriots running backs and receivers from last year, as well as the new player, Danny Amendola, so that we could compare. Because there are a good deal of injuries in football and players miss varying numbers of games, I averaged the totals to see how many yards and touchdowns these players earned per game instead of comparing season totals.
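The per-game normalization is the key step: it puts players with different injury histories on a comparable footing. A small sketch, again with hypothetical season totals and games-played counts:

```python
import pandas as pd

# Hypothetical season totals; games played varies because of injuries.
stats = pd.DataFrame({
    "player":     ["Wes Welker", "Rob Gronkowski", "Danny Amendola"],
    "yards":      [1354, 790, 666],
    "touchdowns": [6, 11, 3],
    "games":      [16, 11, 11],
})

# Divide season totals by games played to get per-game averages,
# so an 11-game season is comparable with a full 16-game one.
stats["yards_per_game"] = stats["yards"] / stats["games"]
stats["tds_per_game"] = stats["touchdowns"] / stats["games"]
print(stats[["player", "yards_per_game", "tds_per_game"]])
```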
In the chart there are really two outliers. The first is the yellow dot, representing tight end Rob Gronkowski, whose average of over one touchdown per game is pretty astounding. The other is our topic of conversation, Wes Welker, who is fairly high on touchdowns but stands out even more for yards gained per game, at almost 100. Based on this chart we can see why people are upset that he's leaving. Our new player, Danny Amendola, is represented by the light green dot that falls in the middle: not terrible, but certainly not the caliber of the other two players mentioned.
Now let’s bring the salary numbers into the comparison. In the chart below we’ve added the players 2012 salaries, which are represented by the bubble size with the largest bubbles being the biggest salaries.
Adding the salaries really changes the dynamic of how we look at each of these players. While in the previous chart we saw Wes Welker (the dark green bubble above) as one of the best players in the grouping, it is obvious here that he is also by far the highest paid player in the group. In comparison, the next closest player, Aaron Hernandez (dark blue bubble), has a salary roughly a third of Welker's, even though his performance numbers are only incrementally lower. After looking at this chart, it starts to become clear that while Welker may have been the best player on the field, he certainly was not the best value of performance per dollar.
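The "value of performance per dollar" view can be made explicit with one derived column. A sketch with hypothetical per-game production and cap numbers (not the players' real figures):

```python
import pandas as pd

# Hypothetical per-game production and 2012 salaries, illustrating
# the performance-per-dollar comparison described above.
players = pd.DataFrame({
    "player":         ["Wes Welker", "Aaron Hernandez", "Danny Amendola"],
    "yards_per_game": [84.6, 48.3, 60.5],
    "salary_2012":    [9_515_000, 1_320_000, 1_912_500],
})

# Yards per game per million dollars of salary: a crude but telling
# value metric. A high salary drags this number down fast.
players["yards_per_million"] = (
    players["yards_per_game"] / (players["salary_2012"] / 1_000_000)
)
print(players.sort_values("yards_per_million", ascending=False))
```

With these illustrative numbers, the highest raw producer ends up last on value, which is exactly the shift in perspective the bubble chart above shows.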
While this comparison alone was valuable, I wanted to compare multiple other measures at the same time as well as share my findings with the colleagues who I’ve so hotly debated the topic with, so the next step was publishing this to the Lumira Cloud.
Once published into the cloud, I was now able to analyze the data even further, bringing in up to eight different measures and dimensions at the same time.
One of the other points I wanted to analyze was the claim that the Patriots only wanted young players and Welker was too old for them. In the chart below I took all the players we saw before and grouped them by the year they entered the league. We then plotted their 2012 salary (y-axis) against their 2013 salary (x-axis), with the performance data points represented by color and size. When we analyze this it becomes clear that the Patriots are in fact investing the majority of their money in young talent and getting a great return on it. You can see Welker represented by 2004 at the top right of the chart, but the bulk of the Patriots' money is actually going to players six-plus years younger, with 2010 sitting further right along the salary axis. The size and dark color of 2010 and 2011 also show that this is where the bulk of the Patriots' stats are coming from as well.
Finally, I wanted to specifically compare Danny Amendola with Wes Welker in a week-by-week comparison from last year. Using the animation feature in Lumira Cloud, I was able to roll through each week, quickly comparing how both players performed throughout the season. While it doesn't show up as well on paper, you can see in the example below how I can click play and stream through each week with the animation feature, and see that Amendola actually had several weeks where he outperformed Welker.
In the end, the final judgment of which player the Patriots should have signed will be made on the field this upcoming season. But through analytics we were able to compare not only the often-discussed on-field stats but also less talked about factors like salary and age, which contribute just as much, if not more, to the decision-making process. After all, this is a business, and like any other business what it ultimately comes down to is return on investment. What this showed us is that maybe it's not as simple as just taking the best player on the field.
Clearing the Big Data Hurdle: The Open Source Advantage
By Christopher M. Carter
In today's world there is a new understanding, the emergence of a new "reality" that is very different from what we had even a decade ago. The importance of the big data that exists within today's enterprises cannot be overstated. Big data is becoming more important in all industries, but none more so than finance, both in enterprises and in the big Wall Street firms. Most businesses aren't ready to manage this flood of data, much less do anything interesting with it.
Big data will impact every industry, from finance to education and government. In fact, the Federal government just announced a new big data research initiative, with a budget of $200 million.
Data as a whole is a catalyst for business. According to IDC, 2.7 zettabytes of data will be created this year alone. Look inside the enterprise and you begin to see that in order to analyse and derive value from these increasingly large data sets, organisations need to embrace the right tools for these new capabilities. As businesses begin to better understand their existing data, they can gain competitive advantage in the process; however, that advantage can only be realised if the data is processed intelligently and efficiently, and the results are delivered in a timely manner.
How does the enterprise begin to mine its data? Good question. With so much data that firms can become overwhelmed, how can the good data be identified? What data is "needed", and what information is less valuable? The old mantra of "good data in, good data out; bad data in, bad data out" is a starting point for answering these questions. All firms need to be cognizant, first and foremost, of the quality of the data entering their systems and used in daily operations. This is especially important in industries like finance, where data is the lifeblood of the business.
Opportunities abound in big data, and an organisation can get as much potential knowledge out of its stored data as it puts energy into analysing it. With applications spanning from SAP's Business Objects and the in-memory capabilities of Hana to newer applications, members of the finance sector are looking to add positions like Chief Data Officer specifically to make the key decisions around information that need to be made today. Big data is indeed big, but it is not for all purposes. For example, it is not for transactional or real-time in-memory processing of small, endless streams of structured data. Think of a big truck versus a small sedan: each has its purpose. Both big data platforms and fast in-memory traditional databases have a place in driving business.
Opportunities to harness and utilise big data become more feasible when open source frameworks come into play. The open source world has essentially created the new age of big data analytics, from Hadoop, the most widely used and best-known solution among developers, to products like Greenplum from EMC. These tools have created a rush to market to support organisations trying to compute as much data as fast as they can, with solutions that allow them to make decisions in as close to real time as possible. For example, a major retailer with outlets around the globe can, using an open source framework, harness the data coming in from its social media sites and run it through its enterprise data analytics solution, utilising literally thousands of nodes, to make real-time decisions in its stores about products and pricing.
Three to five years ago this was not possible. But, with the large and active open source community working on the framework this computation ability now exists and is being utilised and modified by new companies daily.
Corporations are looking at their data as an asset within their walls no matter where it physically resides, yet there is still so much to learn and to dig through. The new Chief Data Officer and their team must stay vigilant and attend to the many factors that directly impact the business, including how and what data is being provided to regulators. Enterprises need to set standards for their information, and this is more important than ever in an increasingly regulation-focused landscape. Firms need to ensure their internal processes cover current government regulatory requirements, as well as take into account the regulations in the many new laws being created, seemingly on the fly.
There is no doubt that bringing the power of big data and harnessing its performance is important and that it will become more strategic when considering how organisations will use the data to interact with their clients, competitors and the market through faster decision-making. Some companies will start to shrink under the pressure of this new data analysis, while some may indeed fail completely. But regardless of which companies falter, and which ones gain market share, one thing is for certain: database companies should see tremendous gains as the need for more and more database applications increases.
Organisations are looking to the future and deciding how important a role big data will play in the coming years. The truth is, how firms utilise big data as a source of knowledge and power will be the largest influence on who succeeds. The enterprises that find success adopting open source tools to analyse their information will see improved profitability, provide stronger service throughout the organisation and to their customers, and rise above in the land of giants.
SAP Hana solution "Ignite"
Ignite by Approyo is an SAP HANA solution that provides faster, easier integration of SAP HANA data for use by multiple third-party dashboard and reporting solutions.
App Highlights
- Faster data dashboards
- Easier data integration
- Easier Dashboard implementation
- Faster data transactions
Ignite by Approyo provides a strong out-of-the-box SAP HANA dashboard solution that integrates SAP HANA with the data you need to review instantly, so your CFO can have their own dashboard of data while your AP and Marketing teams can each have their own. We recommend downloading Ignite and "igniting" your ability to view processed data faster today. If you have any additional questions, you can reach Approyo directly at info@approyo.com.