
Monday, June 17, 2013

What can Business Intelligence do for you?

Source: http://www.ironsidegroup.com/

Friday, May 24, 2013

Artificial intelligence marketing

Artificial intelligence marketing (AIM) is a form of direct marketing that leverages database marketing techniques as well as AI concepts and models such as machine learning and Bayesian networks. The main difference lies in the reasoning part, which is performed by computers and algorithms instead of humans.

Collect, reason, act

The principle of artificial intelligence marketing is based on the perception-reasoning-action cycle found in cognitive science. In a marketing context, this cycle is adapted to form the collect, reason and act cycle.

Collect

This term relates to all activities that aim at capturing customer or prospect data. Whether captured online or offline, these data are then saved into customer or prospect databases.

Reason

This is the part where data is transformed into information and, eventually, intelligence or insight. This is where artificial intelligence, and machine learning in particular, has a key role to play.

Act

With the intelligence gathered from the reason step above, you can then act. In a marketing context, acting would be some form of communication that attempts to influence a prospect's or customer's purchase decision using an incentive-driven message.
Again, artificial intelligence has a role to play at this stage as well. Ultimately, in an unsupervised model, the machine would take the decision and act according to the information it receives at the collect stage.
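
To make the collect-reason-act cycle concrete, here is a minimal, purely illustrative Python sketch. The field names, sample numbers and the 0.5 threshold are assumptions invented for this example, not taken from any particular AIM product; a naive Bayes classifier stands in for the reasoning step.

import pandas as pd
from sklearn.naive_bayes import GaussianNB

# Collect: customer data captured online or offline and stored in a database table.
# (Invented sample data.)
history = pd.DataFrame({
    "visits":      [1, 7, 3, 9, 2, 8],
    "email_opens": [0, 5, 1, 6, 0, 4],
    "purchased":   [0, 1, 0, 1, 0, 1],   # past outcomes the model learns from
})

# Reason: a simple Bayesian model turns raw data into insight,
# here the probability that a prospect will purchase.
model = GaussianNB().fit(history[["visits", "email_opens"]], history["purchased"])

# Act: new prospects are scored and high scorers receive an incentive-driven message.
prospects = pd.DataFrame({"visits": [2, 6], "email_opens": [1, 5]})
for i, p in enumerate(model.predict_proba(prospects)[:, 1]):
    action = "send incentive offer" if p > 0.5 else "keep nurturing"
    print(f"prospect {i}: purchase probability {p:.2f} -> {action}")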


Business Intelligence 2.0

Business Intelligence 2.0 (BI 2.0) is a set of new tools and software for business intelligence, beginning in the mid-2000s, that enable, among other things, dynamic querying of real-time corporate data by employees, and a more web- and browser-based approach to such data, as opposed to the proprietary querying tools that had characterized previous business intelligence software.
This change is partly due to the popularization of service-oriented architectures (SOA), which enable flexible, composable and adaptive middleware. Also, open standards for exchanging data such as XBRL (Extensible Business Reporting Language), Web Services and various Semantic Web ontologies enable the use of data external to an organization, such as benchmarking information.
Business Intelligence 2.0 is most likely named after Web 2.0, although it takes elements from both Web 2.0 (a focus on user empowerment and community collaboration, technologies like RSS and the concept of mashups), and the Semantic Web, sometimes called "Web 3.0" (semantic integration through shared ontologies to enable easier exchange of data).
According to analytics expert Neil Raden, BI 2.0 also implies a move away from the standard data warehouse that business intelligence tools have used, which "will give way to context, contingency and the need to relate information quickly from many sources."

Lessons from social media

According to president and CEO Greg Nelson, BI 2.0 has a lot to learn from social media such as Twitter, Facebook and blogs. In the white paper Business Intelligence 2.0: Are we there yet?, he lists lessons and opportunities that BI can learn from social media.
Lessons and opportunities:
From Facebook the following can be learned: a destination naturally forces people to log in and participate; a continuous flow of information; an environment for developers to create their own applications; posting things of interest (reports, graphics, interpretations) to a personal page (things I have discovered).
From Twitter the following can be learned: a real-time, continuous flow of decisions and status about the business, and complex event processing; a platform that evolves through unplanned usage and organic evolution of capabilities; succinct explanation of the state of the business; search commentary and generate word clouds that provide a visualization of the "vibe" or sentiment of the business; tags and user comments; submit noteworthy information (anything on the web you think is newsworthy) and associate it with data or objects.
From blogs the following can be learned: information can be interpreted and informally published to a group of interested parties.
From RSS the following can be learned: commentary should be available as an RSS feed, and reports or data updates could be delivered via RSS feed.

The future of BI 2.0

According to president and CEO Greg Nelson, the future of BI will see great change when it comes to BI 2.0. In his 2010 white paper he concludes the following:
  1. Decisions, facts and context will be developed through crowdsourcing.
  2. Similarly, data and reports will incorporate narrative context information supplied by users. For example, data points and graphs annotated with descriptive insight directly alongside the results.
  3. Data will have a more direct linkage with action. When you see something wrong, the data will tell you where it is going wrong and why. Exceptions, alerts and notifications will be based on dynamic business rules that learn about your business and what you are interested in.
  4. People will be able to directly act on information. Interactions with operational systems, requests for information, comments, “start a discussion”, provide supporting information, “become a follower” of the metric, and “rate or report a problem” will reside alongside the data.
  5. Business decisions shall be monitored so that interventions and our hypotheses about business tactics will be tagged in the context of the data that measures their effect. Our ability to test a hypothesis will be integrated into our decision support systems. Say, for example, we see something in the data; we explore it; we understand its root cause; and design an intervention to deal with it. We will be able to tag interventions or events that have happened and have that appear in the context of the reporting of the data so that over time our collective knowledge about the world will be captured alongside the data and artifacts.
  6. Visualizing data and complex relationships will be easier, and more intuitive models of info-graphics will become mainstream. Tools will have the ability to create graphic representations of the data based on what they "see", displaying the best visual presentation given what they have. Furthermore, the tools will learn which visualizations work best for you and your environment.
  7. The ability to detect complex patterns in data through automated analytic routines or intelligent helper models will be built into analytic applications.
  8. Finding information will be easier and search results will provide context so that we know when we have the right results. Users will have the ability to tag specific data elements at various levels (page, widget, some aspect of the data presentation – row, column, cell, line, point) or an abstract interpretation of the results. Anyone looking at the same data will see that context when viewed.
  9. Linkages with unstructured contents such as documents, discussions and commentary as well as a knowledge base of previously answered requests will be key to ensuring collective knowledge and collaboration.
  10. Technical, process and business event monitoring will allow streamlined operational processes (Business Process Engineering, Business Activity Monitoring, Business Rules Engineering), and learning models will be applied to the organizational flow of data.
Source: Nelson, Greg (2010), Business Intelligence 2.0: Are we there yet?





Saturday, May 18, 2013

Introduction to business intelligence


Data visualization is the study of the visual representation of data, meaning "information that has been abstracted in some schematic form, including attributes or variables for the units of information".

According to Friedman (2008) the "main goal of data visualization is to communicate information clearly and effectively through graphical means. It doesn’t mean that data visualization needs to look boring to be functional or extremely sophisticated to look beautiful. To convey ideas effectively, both aesthetic form and functionality need to go hand in hand, providing insights into a rather sparse and complex data set by communicating its key-aspects in a more intuitive way. Yet designers often fail to achieve a balance between form and function, creating gorgeous data visualizations which fail to serve their main purpose — to communicate information".
Indeed, Fernanda Viegas and Martin M. Wattenberg have suggested that an ideal visualization should not only communicate clearly, but stimulate viewer engagement and attention.
Data visualization is closely related to information graphics, information visualization, scientific visualization, and statistical graphics. In the new millennium, data visualization has become an active area of research, teaching and development. According to Post et al. (2002), it has united scientific and information visualization. Brian Willison has demonstrated that data visualization has also been linked to enhancing agile software development and customer engagement.
KPI Library has developed the “Periodic Table of Visualization Methods,” an interactive chart displaying various data visualization methods. It includes six types of data visualization methods: data, information, concept, strategy, metaphor and compound.
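
As a tiny illustration of the "clear and effective" goal (the regional numbers below are made up), a few lines of Python with matplotlib can sort and label a chart so the key message is readable at a glance:

import matplotlib.pyplot as plt

# Invented sample data.
regions = ["North", "South", "East", "West"]
sales = [120, 95, 180, 60]

# Sort so the ordering itself communicates which regions lead.
pairs = sorted(zip(regions, sales), key=lambda rs: rs[1], reverse=True)
labels, values = zip(*pairs)

fig, ax = plt.subplots()
ax.bar(labels, values)
ax.set_title("Quarterly sales by region (sorted to highlight the leaders)")
ax.set_ylabel("Sales (units)")
plt.show()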

-----------

See This Video Too


What is Management Information System ?

 
A management information system (MIS) provides information that organizations need to manage themselves efficiently and effectively. Management information systems are typically computer systems used for managing five primary components: hardware, software, data (information for decision making), procedures (design, development and documentation), and people (individuals, groups, or organizations). Management information systems are distinct from other information systems in that they are used to analyze and facilitate strategic and operational activities. Academically, the term is commonly used to refer to the study of how individuals, groups, and organizations evaluate, design, implement, manage, and utilize systems to generate information to improve efficiency and effectiveness of decision making, including systems termed decision support systems, expert systems, and executive information systems. Most business schools (or colleges of business administration within universities) have an MIS department, alongside departments of accounting, finance, management, marketing, and sometimes others, and grant degrees (at undergraduate, master's, and PhD levels) in MIS.

---------------
 See These Videos

Information Systems Management - Video Presentation

What is Decision Support System ?

A decision support system (DSS) is a computer-based information system that supports business or organizational decision-making activities. DSSs serve the management, operations, and planning levels of an organization and help people make decisions about problems that may be rapidly changing and not easily specified in advance. Decision support systems can be fully computerized, human-powered, or a combination of both.
DSSs include knowledge-based systems. A properly designed DSS is an interactive software-based system intended to help decision makers compile useful information from a combination of raw data, documents, and personal knowledge, or business models to identify and solve problems and make decisions.
Typical information that a decision support application might gather and present includes:
  • inventories of information assets (including legacy and relational data sources, cubes, data warehouses, and data marts),
  • comparative sales figures between one period and the next,
  • projected revenue figures based on product sales assumptions.
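
As a rough sketch of the kinds of figures listed above, assuming a small invented sales table, product prices and a 5% growth assumption, a DSS-style calculation in Python might look like this:

import pandas as pd

# Invented sales figures for two periods.
sales = pd.DataFrame({
    "product":     ["A", "B", "C"],
    "last_period": [1000, 750, 420],
    "this_period": [1150, 700, 510],
})

# Comparative sales figures between one period and the next.
sales["change_pct"] = (sales["this_period"] / sales["last_period"] - 1) * 100

# Projected revenue based on a product-sales assumption (5% growth, assumed prices).
unit_price = {"A": 20.0, "B": 35.0, "C": 12.5}
sales["projected_revenue"] = sales.apply(
    lambda r: r["this_period"] * 1.05 * unit_price[r["product"]], axis=1)

print(sales)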

Benefits

  1. Improves personal efficiency
  2. Speeds up the process of decision making
  3. Increases organizational control
  4. Encourages exploration and discovery on the part of the decision maker
  5. Speeds up problem solving in an organization
  6. Facilitates interpersonal communication
  7. Promotes learning or training
  8. Generates new evidence in support of a decision
  9. Creates a competitive advantage over competitors
  10. Reveals new approaches to thinking about the problem space
  11. Helps automate managerial processes
  12. Creates innovative ideas to improve performance

DSS Characteristics and capabilities

  1. Solves semi-structured and unstructured problems
  2. Supports managers at all levels
  3. Supports individuals and groups
  4. Supports interdependent and sequential decisions
  5. Supports the intelligence, design and choice phases
  6. Adaptable and flexible
  7. Interactive and easy to use
  8. Effectiveness and efficiency
  9. Humans control the process
  10. Ease of development by end users
  11. Modeling and analysis
  12. Data access
  13. Standalone, integrated and web-based
  14. Supports a variety of decision processes
-----------------
DSS Vs. DMS

 -------------
These videos will help too:

What is Business Performance Management ?

 

Business performance management is a set of management and analytic processes that enables the management of an organization's performance to achieve one or more pre-selected goals. Synonyms for "business performance management" include "corporate performance management (CPM)" and "enterprise performance management".
Business performance management is contained within approaches to business process management.
Business performance management has three main activities:
  1. selection of goals,
  2. consolidation of measurement information relevant to an organization’s progress against these goals, and
  3. interventions made by managers in light of this information with a view to improving future performance against these goals.
Although presented here sequentially, typically all three activities will run concurrently, with interventions by managers affecting the choice of goals, the measurement information monitored, and the activities being undertaken by the organization.
Because business performance management activities in large organizations often involve the collation and reporting of large volumes of data, many software vendors, particularly those offering business intelligence tools, market products intended to assist in this process. As a result of this marketing effort, business performance management is often incorrectly understood as an activity that necessarily relies on software systems to work, and many definitions of business performance management explicitly suggest software as being a definitive component of the approach.
This interest in business performance management from the software community is sales-driven - "The biggest growth area in operational BI analysis is in the area of business performance management."
Since 1992, business performance management has been strongly influenced by the rise of the balanced scorecard framework. It is common for managers to use the balanced scorecard framework to clarify the goals of an organization, to identify how to track them, and to structure the mechanisms by which interventions will be triggered. These steps are the same as those found in BPM, and as a result the balanced scorecard is often used as the basis for business performance management activity within organizations.
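
As a toy illustration of the three activities listed earlier, selection of goals, consolidation of measurements and interventions (the metric names, targets and the simple intervention rule below are assumptions made up for this sketch):

# 1. Selection of goals (assumed targets).
goals = {
    "on_time_delivery_pct": {"target": 95.0, "higher_is_better": True},
    "defect_rate_pct":      {"target": 1.0,  "higher_is_better": False},
}

# 2. Consolidation of measurement information relevant to progress against the goals.
measurements = {"on_time_delivery_pct": 91.5, "defect_rate_pct": 0.8}

# 3. Interventions made by managers in light of this information.
for metric, goal in goals.items():
    actual = measurements[metric]
    on_track = (actual >= goal["target"]) if goal["higher_is_better"] else (actual <= goal["target"])
    if on_track:
        print(f"{metric}: on track ({actual} vs goal {goal['target']})")
    else:
        print(f"{metric}: actual {actual} vs goal {goal['target']} -> plan an intervention")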


-------------
This video will help with further understanding:
 Cubes Corporate Performance Management

What is Dashboard ?

A dashboard is "an easy to read, often single page, real-time user interface, showing a graphical presentation of the current status (snapshot) and historical trends of an organization’s key performance indicators to enable instantaneous and informed decisions to be made at a glance."
For example, a manufacturing dashboard may show key performance indicators related to productivity such as number of parts manufactured, or number of failed quality inspections per hour. Similarly, a human resources dashboard may show KPIs related to staff recruitment, retention and composition, for example number of open positions, or average days or cost per recruitment.
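
For example, the manufacturing KPIs mentioned above could be computed from raw hourly records with a few lines of Python; the counts below are fabricated sample data, not figures from any real plant.

# Fabricated hourly production records.
hourly = [
    {"hour": "08:00", "parts_made": 410, "failed_inspections": 6},
    {"hour": "09:00", "parts_made": 395, "failed_inspections": 9},
    {"hour": "10:00", "parts_made": 430, "failed_inspections": 4},
]

total_parts = sum(h["parts_made"] for h in hourly)
total_failed = sum(h["failed_inspections"] for h in hourly)

print(f"Parts manufactured (snapshot): {total_parts}")
print(f"Failed inspections per hour (average): {total_failed / len(hourly):.1f}")
print(f"Failure rate: {100 * total_failed / total_parts:.2f}%")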
Why Business Intelligence Projects Fail ?

We all know that BI is very important for every company, but why do some BI solutions fail to help some companies?


Here are some reasons:
1) No executive sponsorship: Most BI projects are highly cross-functional in nature. They need collaboration from various departments within an organization. Unless there is executive buy-in and commitment, departments seldom come together in a timely fashion and collaborate, as every department has its own priorities and targets to meet.
2) Underestimating the commitment level, cost and effort: Let us be honest - to have a really meaningful BI solution that helps strengthen the organization's top line and bottom line, it takes sustained effort, which can translate into a considerable commitment and cost. Many BI projects are abandoned mid-way and deemed a failure because the executive sponsors were not briefed about the needed commitment in advance.
3) Biting off more than can be chewed: The old adage "Don't bite off more than you can chew" is still valid today! Most BI teams get into a 'monument building' mode right from the word go. The best way to succeed in a BI project is to intelligently prioritize business needs and break them into manageable chunks.
4) Making IT drive BI projects: The term BI stands for Business Intelligence - it is intimately related to the business! Ideally the business should be driving it, with collaboration and support from IT. Many organizations let IT drive their BI projects from concept to completion with minimal involvement of the business. This is a sure recipe for mismatched expectations and unexpected results, often leading to failure.
5) Trying to fit a square peg into a round hole: This cliché is very much true in cases where BI needs and requirements are not understood clearly and the project team tries to fit the business needs into an incorrect technology solution.

----------------
This video speaks about that issue too:

Why Business Intelligence Projects Fail

What is Predictive Analytics ?

Predictive analytics encompasses a variety of techniques from statistics, modeling, machine learning, and data mining that analyze current and historical facts to make predictions about future, or otherwise unknown, events.
In business, predictive models exploit patterns found in historical and transactional data to identify risks and opportunities. Models capture relationships among many factors to allow assessment of risk or potential associated with a particular set of conditions, guiding decision making for candidate transactions.
Predictive analytics is used in actuarial science, marketing, financial services, insurance, telecommunications, retail, travel, healthcare, pharmaceuticals and other fields.
One of the most well-known applications is credit scoring, which is used throughout financial services. Scoring models process a customer's credit history, loan application, customer data, etc., in order to rank-order individuals by their likelihood of making future credit payments on time. A well-known example is the FICO score.
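
As a hedged, toy illustration of rank-ordering applicants by likelihood of paying on time (the two features and the tiny training set are invented, and this is not how the FICO score itself is built), a simple logistic regression in Python could look like this:

from sklearn.linear_model import LogisticRegression

# Invented historical applicants: [late_payments_last_year, debt_to_income_ratio]
X = [[0, 0.20], [5, 0.60], [1, 0.35], [7, 0.70], [0, 0.15], [3, 0.50]]
y = [1, 0, 1, 0, 1, 0]          # 1 = paid on time, 0 = defaulted

model = LogisticRegression().fit(X, y)

# New applicants (also invented) scored and rank-ordered by likelihood of on-time payment.
applicants = {"alice": [0, 0.25], "bob": [4, 0.55], "carol": [2, 0.40]}
scores = {name: model.predict_proba([feats])[0][1] for name, feats in applicants.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: probability of paying on time {score:.2f}")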

 -----------
And here is the PA cycle:

 -----------

To make it simple, see this video:

Friday, May 17, 2013

What is Business Analytics ?

Business analytics (BA) refers to the skills, technologies, applications and practices for continuous iterative exploration and investigation of past business performance to gain insight and drive business planning. Business analytics focuses on developing new insights and understanding of business performance based on data and statistical methods. In contrast, business intelligence traditionally focuses on using a consistent set of metrics to both measure past performance and guide business planning, which is also based on data and statistical methods.
Business analytics makes extensive use of data, statistical and quantitative analysis, explanatory and predictive modeling, and fact-based management to drive decision making. It is therefore closely related to management science. Analytics may be used as input for human decisions or may drive fully automated decisions. Business intelligence is querying, reporting, OLAP, and "alerts."
In other words, querying, reporting, OLAP and alert tools can answer questions such as what happened, how many, how often, where the problem is, and what actions are needed. Business analytics can answer questions like why is this happening, what if these trends continue, what will happen next (that is, predict), and what is the best that can happen (that is, optimize).
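
To make that contrast concrete with a deliberately tiny example (the monthly order counts and the linear-trend assumption are invented), a descriptive BI-style query and a simple BA-style forecast might look like this in Python:

import numpy as np
import pandas as pd

# Invented monthly order counts.
monthly = pd.DataFrame({"month": [1, 2, 3, 4, 5, 6],
                        "orders": [200, 220, 245, 260, 290, 310]})

# BI-style question: what happened, how many? (reporting over past performance)
print("Total orders so far:", monthly["orders"].sum())

# BA-style question: what will happen next? (a naive linear trend used as the prediction)
slope, intercept = np.polyfit(monthly["month"], monthly["orders"], 1)
print("Predicted orders for month 7:", round(slope * 7 + intercept))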


See this video too; it shows the importance of BA:


What is Data Replication ?

Replication involves sharing information so as to ensure consistency between redundant resources, such as software or hardware components, to improve reliability, fault-tolerance, or accessibility.

Replication in distributed systems

Replication is one of the oldest and most important topics in the overall area of distributed systems.
Whether one replicates data or computation, the objective is to have some group of processes that handle incoming events. If we replicate data, these processes are passive and operate only to maintain the stored data, reply to read requests, and apply updates. When we replicate computation, the usual goal is to provide fault-tolerance. For example, a replicated service might be used to control a telephone switch, with the objective of ensuring that even if the primary controller fails, the backup can take over its functions. But the underlying needs are the same in both cases: by ensuring that the replicas see the same events in equivalent orders, they stay in consistent states and hence any replica can respond to queries.
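
Here is a toy Python sketch of the passive data-replication case described above; the Replica class, the event log and the balance values are all invented for illustration. The point is only that every replica applies the same events in the same order, so any replica can answer a query consistently.

class Replica:
    """A passive replica that only stores data, answers reads and applies updates."""
    def __init__(self, name):
        self.name = name
        self.store = {}

    def apply(self, event):           # apply one update event
        key, value = event
        self.store[key] = value

    def read(self, key):              # reply to a read request
        return self.store.get(key)

replicas = [Replica("primary"), Replica("backup")]
event_log = [("balance", 100), ("balance", 80), ("status", "active")]

# Deliver the same events, in the same order, to every replica.
for event in event_log:
    for replica in replicas:
        replica.apply(event)

# Because the replicas stayed in consistent states, any of them can answer the query.
assert all(r.read("balance") == 80 for r in replicas)
print("answer from the backup replica:", replicas[1].read("balance"))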
---------------

 ----------
This video shows it in detail:

Thursday, May 16, 2013

What Is Data Management Maturity ?

The Data Management Maturity (DMM) Model defines the components, business processes and capability areas required for effective data management. It provides standard assessment criteria that organizations can use to evaluate data management goals against documented best practice. It defines the ‘what and why’ of data management at both an objective level and from the perspective of practical implementation.
The DMM was created via collaboration of data management practitioners, operations managers, IT professionals and representatives of lines-of-business across the financial industry. It establishes the assessment criteria and requirements for achieving alignment on strategy, implementing governance mechanisms, managing operational components, defining dependencies, aligning data with IT capabilities, ensuring data quality and integrating data into business processes.


---------------------------------------------------------------------------------------
See this video :


What is Data Warehouse ?

The first step to understanding a data warehouse is to see this image:

The second step is to see this nice video:
 
 ---
And you can see the wiki definition too:
A data warehouse or enterprise data warehouse (DW, DWH, or EDW) is a database used for reporting and data analysis. It is a central repository of data which is created by integrating data from one or more disparate sources. Data warehouses store current as well as historical data and are used for creating trending reports for senior management, such as annual and quarterly comparisons.
The data stored in the warehouse are uploaded from the operational systems (such as marketing and sales). The data may pass through an operational data store for additional operations before they are used in the DW for reporting.
The typical ETL-based data warehouse uses staging, data integration, and access layers to house its key functions. The staging layer or staging database stores raw data extracted from each of the disparate source data systems. The integration layer integrates the disparate data sets by transforming the data from the staging layer often storing this transformed data in an operational data store (ODS) database. The integrated data are then moved to yet another database, often called the data warehouse database, where the data is arranged into hierarchical groups often called dimensions and into facts and aggregate facts. The combination of facts and dimensions is sometimes called a star schema. The access layer helps users retrieve data.
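
A highly compressed sketch of that staging, integration and access flow, written in Python with pandas (the source tables, the dimension and fact table names, and all the numbers are invented for illustration), might look like this:

import pandas as pd

# Staging layer: raw extracts from two disparate source systems (invented data).
staging_sales = pd.DataFrame({"cust": ["c1", "c2", "c1"],
                              "amount": [50, 75, 25],
                              "day": ["2013-05-01", "2013-05-01", "2013-05-02"]})
staging_crm = pd.DataFrame({"cust": ["c1", "c2"], "region": ["EU", "US"]})

# Integration layer: transform and conform the staged data (join, clean types).
integrated = staging_sales.merge(staging_crm, on="cust")
integrated["day"] = pd.to_datetime(integrated["day"])

# Warehouse database: a tiny star schema, one dimension table plus a fact table.
dim_customer = integrated[["cust", "region"]].drop_duplicates()
fact_sales = integrated.groupby(["cust", "day"], as_index=False)["amount"].sum()

# Access layer: users retrieve data, e.g. total sales by region.
report = fact_sales.merge(dim_customer, on="cust").groupby("region")["amount"].sum()
print(report)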
A data warehouse constructed from integrated data source systems does not require ETL, staging databases, or operational data store databases. The integrated data source systems may be considered to be a part of a distributed operational data store layer. Data federation methods or data virtualization methods may be used to access the distributed integrated source data systems to consolidate and aggregate data directly into the data warehouse database tables. Unlike the ETL-based data warehouse, the integrated source data systems and the data warehouse are all integrated since there is no transformation of dimensional or reference data. This integrated data warehouse architecture supports the drill down from the aggregate data of the data warehouse to the transactional data of the integrated source data systems.
Data warehouses can be subdivided into data marts. Data marts store subsets of data from a warehouse.
This definition of the data warehouse focuses on data storage. The main source of the data is cleaned, transformed, cataloged and made available for use by managers and other business professionals for data mining, online analytical processing, market research and decision support (Marakas & O'Brien 2009). However, the means to retrieve and analyze data, to extract, transform and load data, and to manage the data dictionary are also considered essential components of a data warehousing system. Many references to data warehousing use this broader context. Thus, an expanded definition for data warehousing includes business intelligence tools, tools to extract, transform and load data into the repository, and tools to manage and retrieve metadata.




What is OLAP ?

OLAP stands for online analytical processing.

Wikipedia says:
Online analytical processing, or OLAP, is an approach to answering multi-dimensional analytical (MDA) queries swiftly. OLAP is part of the broader category of business intelligence, which also encompasses relational databases, report writing and data mining. Typical applications of OLAP include business reporting for sales, marketing, management reporting, business process management (BPM), budgeting and forecasting, financial reporting and similar areas, with new applications coming up, such as agriculture. The term OLAP was created as a slight modification of the traditional database term OLTP (online transaction processing).
OLAP tools enable users to analyze multidimensional data interactively from multiple perspectives. OLAP consists of three basic analytical operations: consolidation (roll-up), drill-down, and slicing and dicing. Consolidation involves the aggregation of data that can be accumulated and computed in one or more dimensions. For example, all sales offices are rolled up to the sales department or sales division to anticipate sales trends. By contrast, the drill-down is a technique that allows users to navigate through the details. For instance, users can view the sales by individual products that make up a region’s sales. Slicing and dicing is a feature whereby users can take out (slicing) a specific set of data of the OLAP cube and view (dicing) the slices from different viewpoints.
Databases configured for OLAP use a multidimensional data model, allowing for complex analytical and ad-hoc queries with a rapid execution time. They borrow aspects of navigational databases, hierarchical databases and relational databases.
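
A rough pandas analogy for those three operations (the tiny sales "cube" below is invented sample data, and a real OLAP engine does far more behind the scenes) could look like this:

import pandas as pd

# Invented sample "cube" of sales by region, product and quarter.
cube = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "East", "West"],
    "product": ["A", "B", "A", "B", "A", "B"],
    "quarter": ["Q1", "Q1", "Q1", "Q1", "Q2", "Q2"],
    "sales":   [100, 80, 120, 60, 110, 70],
})

# Consolidation (roll-up): aggregate sales up to the region level.
print(cube.pivot_table(values="sales", index="region", aggfunc="sum"))

# Drill-down: navigate into the detail, e.g. products within each region.
print(cube.pivot_table(values="sales", index=["region", "product"], aggfunc="sum"))

# Slicing and dicing: take out one slice (Q1) and view it from another viewpoint.
q1 = cube[cube["quarter"] == "Q1"]
print(q1.pivot_table(values="sales", index="product", columns="region", aggfunc="sum"))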



---------------------
See This Video :

Tuesday, May 14, 2013

What is IBM Cognos TM1 ?


 -----------------------
IBM Cognos TM1 (formerly Applix TM1) is enterprise planning software used to implement collaborative planning, budgeting and forecasting solutions, as well as analytical and reporting applications. Data in IBM Cognos TM1 is stored and represented as multidimensional OLAP cubes, with data being stored at the "leaf" level. Computations on the leaf data are performed in real-time (for example, to aggregate numbers up a dimensional hierarchy). IBM Cognos TM1 includes a data orchestration environment for accessing external data and systems, as well as capabilities designed for common business planning and budgeting requirements (e.g. workflow, top-down adjustments).

What is KPI ?

KPI stands for Key Performance Indicator.



Like These


-----------

See what Wikipedia says about KPIs:

A performance indicator or key performance indicator (KPI) is a type of performance measurement. An organization may use KPIs to evaluate its success, or to evaluate the success of a particular activity in which it is engaged. Sometimes success is defined in terms of making progress toward strategic goals, but often success is simply the repeated, periodic achievement of some level of operational goal (e.g. zero defects, 10/10 customer satisfaction, etc.). Accordingly, choosing the right KPIs relies upon a good understanding of what is important to the organization. 'What is important' often depends on the department measuring the performance - e.g. the KPIs useful to finance will be quite different than the KPIs assigned to sales. Since there is a need to well understand what is important (to an organization), various techniques to assess the present state of the business, and its key activities, are associated with the selection of performance indicators. These assessments often lead to the identification of potential improvements, so performance indicators are routinely associated with 'performance improvement' initiatives. A very common way to choose KPIs is to apply a management framework such as the balanced scorecard.
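
As a very small sketch of checking KPIs against operational targets of the kind mentioned above (the metric names and target values are assumptions made up for this example):

# Assumed KPIs with invented targets; some KPIs improve downwards, some upwards.
kpis = {
    "defects_per_1000_units": {"actual": 1.8, "target": 0.0,  "lower_is_better": True},
    "customer_satisfaction":  {"actual": 8.7, "target": 10.0, "lower_is_better": False},
}

for name, kpi in kpis.items():
    if kpi["lower_is_better"]:
        met = kpi["actual"] <= kpi["target"]
    else:
        met = kpi["actual"] >= kpi["target"]
    status = "on target" if met else "needs attention"
    print(f"{name}: actual {kpi['actual']} vs target {kpi['target']} -> {status}")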

----
This video is also good:



What is Spotfire ?

Spotfire was a business intelligence company based in Somerville, Massachusetts. It was bought by TIBCO in 2007.

TIBCO Spotfire is a software platform that allows customers to analyze data using statistics. The latest release, 4.0, includes integration with "Statistics Services" and the ability to develop dynamic analytic applications that run on the web through a client called TIBCO Spotfire Web Player. In addition, TIBCO Spotfire delivers enterprise-strength self-service predictive analytics to speed decisions and help customers achieve a two-second advantage™. The new TIBCO® Enterprise Runtime for R engine within TIBCO Spotfire 5.0 brings scalability and stability to agile R language and enables users to adapt and seamlessly implement enterprise-grade predictive models in hours, not days. Tibco Spotfire is already a popular product for finding the proverbial needles in the haystack, but customers are telling Tibco that the haystacks are getting bigger. Oil and gas operations, for example, are gathering more information from well sensors. Manufacturers are grabbing more production data off of shop floors. And marketers are gathering more customer analysis and segmentation information through click streams and social networks. As the data stacks up, the nature of the analysis changes, according to Steve Farr, a Tibco Spotfire senior product marketing manager.
"It's tempting to think you can identify the important trends in a small subset of the data, but big data analytics is about considering all of the data so you also see the exceptions and outliers," Farr told InformationWeek. "It's in the outliers that you find fraud, risk, and the things that are growing wrong." You can also find latent patterns in that bigger picture that reveal opportunities.
Spotfire competes with the likes of Tableau Software on data visualization and QlikTech on delivering analytic dashboards and applications. All three vendors are growing quickly on the appeal of fast and intuitive in-memory analysis. To deliver that capability at a bigger scale, a rewrite of the in-memory engine in Spotfire 5.0 takes better advantage of high-capacity, multi-core servers, according to Farr. "We're seeing very little degradation of performance as we add rows of data because we've rewritten in a way that throws all the power of the hardware at the data analysis." But in-memory capacity takes you only so far. When it comes to truly high-volume data analysis up into the tens or hundreds of terabytes or more, in-database analysis is the emerging analytical approach. It lets you apply analytics within the powerful database platforms in which large data sets reside. Users benefit because they don't waste time extracting and moving data, handling analyses on under-powered analytic servers, and then returning result sets back to the database platform. It all happens inside the database.
Tibco is a latecomer to this approach, as SAS, SPSS, Alpine, and other analytics vendors have already ventured into in-database analytics. But Spotfire customers don't want to be left out, according to Farr, so the 5.0 release supports in-database analysis in conjunction with Oracle, Microsoft SQL Server, and Teradata data warehouse platforms. Support for in-database analysis within EMC Greenplum, IBM Netezza, and SAP Hana is also being discussed, but they didn't make the first cut.
The customers most likely to take advantage of in-database processing are those with the largest data volumes. Procter & Gamble, for example, uses Spotfire to serve up data visualizations to some 60,000 employees. It currently uses SAS for most predictive analyses, but the 5.0 release appears to have opened the door to use of Spotfire's predictive capabilities. "We are excited by the prospect of Spotfire 5.0 being able to efficiently analyze and visualize extreme data volumes by executing analytics directly within our database architecture," said Alan Falkingham, P&G's director, business intelligence, in a statement from Tibco. P&G's database platform is Oracle Exadata.
Tibco was an early supporter of analytics based on the popular R programming language with a release back in 2010. The 5.0 release improves on that support with a new Tibco Enterprise Runtime for R, which embeds the open-source R runtime engine into the Spotfire statistical server. This lets customers expose R to a much larger audience of users--beyond the Ph.D crowd--within a simple, object-oriented programming environment.


------

See This Video


What is iReport ?

I was searching for the best reporting tool and I found that iReport is one of the most amazing reporting tools, and it is open source.


So let's talk about it.

First, see some of its results:

This link has everything about iReport:
http://community.jaspersoft.com/project/ireport-designer

----
Are you interested in using it?

See this video:

What is ERP ?



See this image and you will understand what ERP (Enterprise Resource Planning) is:

And see this too:

Now, is it clear?

Let's see what Wikipedia says about ERP:

Enterprise resource planning (ERP) systems integrate internal and external management of information across an entire organization—embracing finance/accounting, manufacturing, sales and service, customer relationship management, etc. ERP systems automate this activity with an integrated software application. ERP facilitates information flow between all business functions inside the organization, and manages connections to outside stakeholders.
Enterprise system software is a multi-billion dollar industry that produces components that support a variety of business functions. IT investments have become the largest category of capital expenditure in United States-based businesses over the past decade. Enterprise systems are complex software packages that offer the potential of integrating data and processes across functions in an enterprise.
The main example is ERP systems. Organizations consider the ERP system their backbone, and a vital organizational tool because it integrates varied organizational systems, and enables flawless transactions and production. However, an ERP system is radically different from traditional systems development. ERP systems can run on a variety of computer hardware and network configurations, typically employing a database as a repository for information.

------------

And let's see some creative videos about it:

See this too: