GLOSSARY

AI is the discipline that studies whether and how the most complex mental processes can be reproduced by a computer. This research follows two complementary paths: first, AI attempts to bring computers closer to the capabilities of human intelligence; second, AI uses computer simulations to form hypotheses about the mechanisms used by the human mind. An AI system provides: the acquisition and processing of data and information based on appropriate models; a hardware and software environment that ensures the execution of processing and interaction with the outside world; and a model of the goals and constraints of the system, which can adapt to the surrounding environment. AI is characterized by an interest in the following aspects:

  • environmental perception, for example by processing signals from sensors of various types to extract the elements useful for decisions or understanding;
  • interaction with the environment, for example through man-machine interfaces based on the understanding of natural language, handwriting, voice or image signals;
  • learning, resulting in changes in behavior over time;
  • representation of knowledge, for effective interaction with the environment and to facilitate analysis and effective decision making;
  • problem solving, including unstructured problems that require the processing of information in symbolic form;
  • and, in particular, decision making.

In Neuroscience, the term Neural Network (or Neuronal Network) refers to a network or circuit of neurons, often identified as a group of neurons that performs a certain physiological function. Artificial Neural Networks (ANNs) are data processing systems that can be implemented as algorithms or as hardware. An ANN is a mathematical/computational model inspired by biological neural networks and is made up of numerous elements called “artificial neurons”. It is an adaptive system that changes its structure in response to the external or internal information flowing through the network during the learning phase. ANNs are modeled on the neural structure of the mammalian cerebral cortex, though on a much smaller scale, and are typically arranged in layers. Each layer is composed of a large number of interconnected nodes, which process the signals they receive and transmit the result to the nodes of the next layer. Basically, an artificial neural network can be used to solve four categories of problems: classifying data into groups; recognizing regularities, patterns and templates within a large mass of data; making predictions based on the input data in its possession; and optimizing a result already obtained by other means. By its very nature, an artificial neural network can provide very accurate results from a wide range of variable input data.
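
As a purely illustrative sketch of the layered structure described above (not a reference implementation), the following short Python/NumPy program builds a tiny network with one hidden layer and trains it on the classic XOR problem; the layer sizes, learning rate and number of iterations are arbitrary choices.

```python
import numpy as np

# A minimal feedforward network: 2 inputs -> 4 hidden nodes -> 1 output,
# trained with plain gradient descent on the XOR problem.
rng = np.random.default_rng(42)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input  -> hidden layer
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass: each layer processes the signal and hands it on.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: adjust the connections to reduce the error (learning).
    d_out = (output - y) * output * (1 - output)
    d_hid = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid
    b1 -= lr * d_hid.sum(axis=0, keepdims=True)

print(np.round(output, 2))  # typically ends up close to [0, 1, 1, 0]
```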

Business-to-business refers to commercial and electronic transactions between companies, as distinguished from transactions with consumers (see B2C) and with government bodies (B2G). The volume of B2B transactions is much higher than B2C, because companies must buy all the raw materials and assets necessary to produce a product, whereas consumers buy only the finished product.

Business-to-consumer indicates the relationship that an enterprise holds with its customers, in both sales and service.

Big Data is a term that refers to data collections so large, complex and fast-growing that traditional analytics programs cannot process them to extract value. The term, however, refers not so much to the data itself as to the ability to use all of this information to process, analyze and find objective evidence on different themes. Today, with a simple algorithm, this information can be processed within a few hours, even using an ordinary laptop to access the analysis platform. This is the Big Data revolution. Big Data therefore represents the new ability to link information and to provide a visual approach to data, suggesting patterns and interpretation models so far unimaginable. Big Data does not only affect the IT industry; Information Technology is merely its first field of application, for example through cloud computing, search algorithms and so on. Big Data is also needed and useful in the most diverse markets, from cars to medicine, trade to astronomy, biology to pharmaceutical chemistry, finance to gaming.
No industry in which marketing exists and data needs to be analyzed can say that it is not interested in Big Data.
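
As a loose illustration of the claim that such data can be processed on an ordinary laptop, the following sketch aggregates a large CSV file in chunks with pandas, so that the whole file never needs to fit in memory; the file name and column names (events.csv, country, amount) are hypothetical.

```python
import pandas as pd

# Aggregate a file too large to load at once by streaming it in chunks.
# "events.csv" and its columns ("country", "amount") are invented examples.
totals = {}
for chunk in pd.read_csv("events.csv", chunksize=1_000_000):
    partial = chunk.groupby("country")["amount"].sum()
    for country, amount in partial.items():
        totals[country] = totals.get(country, 0.0) + amount

# The running totals reveal a pattern without ever holding all rows in memory.
for country, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(country, round(amount, 2))
```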

The Cloud is nothing more than a personal storage space, sometimes called cloud storage; it is accessible at any time and from any place, as long as the user has an internet connection. Beyond this definition, the Cloud can also refer to other services offered by cloud computing. The Cloud synchronizes all of the user’s files in one place, with the consequent benefit of re-downloading, editing, deleting and/or updating them (there is no longer any need to carry external hard drives, USB pen drives, or anything else that can easily be lost or forgotten). It also gives the user the ability to make backup copies and to share any files stored in the Cloud with anyone the user chooses, for an indefinite period of time and at the user’s convenience.
Cloud Computing is the delivery of computing services such as servers, storage resources, databases, networking, software, analytics and more via the internet (“the Cloud”). Companies that offer these computing services are called cloud service providers and typically charge for them based on usage, much like domestic water or electricity charges. Cloud computing represents a big change from the traditional way companies think about IT resources. These are the common reasons why organizations turn to cloud computing services:

  • Cloud computing eliminates capital costs associated with hardware and software purchases and the configuration and management of local data centers that require server racks, constant electricity for power and cooling, and IT experts for infrastructure management;
  • Most cloud computing services are provided in self-service and on-demand mode, so even large amounts of computing resources can be provisioned in minutes, typically with just a few clicks of a mouse; this gives companies exceptional flexibility and removes the pressure of capacity planning;
  • The benefits of cloud computing services include elastic scalability. In cloud terms, this means delivering the right amount of IT resources, for example more or less computing power, storage or bandwidth, exactly when they are needed and from the appropriate geographic location;
  • Local data centers typically require rack space and server stacking as well as hardware configuration, software patching and other time-consuming IT management tasks. Cloud computing eliminates the need for many of these activities;
  • The largest cloud computing services run on a worldwide network of secure data centers, regularly upgraded to the latest generation of fast and efficient hardware. This offers several advantages over a single business data center, including reduced network latency for applications and greater economies of scale;
  • Cloud computing makes data backup, disaster recovery and business continuity simpler and less expensive, thanks to the ability to mirror data at multiple redundant sites on the cloud service provider’s network.

This is a systematic and methodological approach to transforming data into innovation. Data is the container of information, but only when combined with statistics and technology can it provide the relevant information needed. Through mathematical and statistical processing, information is turned into knowledge, thanks to the contribution of experts who, starting from the data, are able to formulate new hypotheses and new relationships between them. Innovation driven by the added value of big data is therefore a key element not only for companies but for any manufacturing and service sector. For example, city traffic data could be used by engineers to produce apps or other solutions that make travelling around the city more efficient. Every day users produce data, most of which (especially in Italy) is still not being collected and used fully, because of the lack of the necessary skills and technologies.

Data mining is the process of extracting knowledge from large databases by applying algorithms that identify “hidden” associations between pieces of information. In other words, data mining is the application of one or more techniques that allow large amounts of data to be explored with the aim of identifying the most relevant information and making it available and directly usable in the decision-making phase. The extraction of knowledge (meaningful information) occurs by identifying associations, patterns, repeated sequences or regularities hidden in the data. In this context, a pattern means a structure, a template or, in general, a synthetic representation of the data. The term data mining is used as a synonym for “knowledge discovery in databases” (KDD).
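
To make the idea of hidden associations concrete, here is a minimal, standard-library-only sketch that counts which pairs of items appear together in a set of transactions more often than a chosen support threshold; the baskets and the threshold are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical shopping transactions; each set is one basket.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"milk", "butter"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
]

min_support = 3  # a pair is "frequent" if it appears in at least 3 baskets

pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The frequent pairs are a simple example of a pattern hidden in the data.
for pair, count in pair_counts.items():
    if count >= min_support:
        print(pair, count)
```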

Fuzzy means uncertain or approximate; the theory of “fuzzy sets” originates from the observation that membership of an element in a set need not be a sharp in/out decision but can change gradually and continuously between zero and one, that is, it can assume any intermediate value. A classic example: the attribute “young” is subjective, not deterministic, and “uncertain” with respect to the age variable; most people agree that a person under the age of 18 is young and that, as age increases, the person becomes less so. Membership of the set of young people can therefore be described by a function of age that assumes the value one up to 18 and then decreases towards zero (not necessarily linearly). A 15-year-old is young with degree one, a twenty-one-year-old with a degree of 0.8, a forty-year-old with 0.1, and so on. Usually, for simplicity, trapezoidal or triangular membership functions are used. The most frequent application is in control systems, especially regulators (of speed, temperature, etc.): it is rather arbitrary (within certain limits) to say whether a speed is high or low, while it is much simpler to say whether it is certainly high or certainly not. Membership functions are then drawn against the measurable independent variable so that, for any value, the degree to which an element belongs to each set is known (an element can, for instance, be young and adult at the same time); the memberships are often chosen so that they sum to one for each value. On top of these functions, the membership rules are written.
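
The age example can be written as a small piecewise-linear (trapezoidal) membership function; the breakpoints below (full membership up to 18, zero membership from 45) are arbitrary illustrative choices and do not reproduce the exact figures quoted above.

```python
def young(age, full_until=18.0, zero_at=45.0):
    """Degree of membership in the fuzzy set 'young', between 0 and 1."""
    if age <= full_until:
        return 1.0                      # certainly young
    if age >= zero_at:
        return 0.0                      # certainly not young
    # Linear transition between the two breakpoints.
    return (zero_at - age) / (zero_at - full_until)

for age in (15, 21, 40, 50):
    print(age, round(young(age), 2))    # e.g. 15 -> 1.0, 50 -> 0.0
```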

Industry 4.0, or the fourth industrial revolution, aims to fully automate and interconnect industrial production. Building on these two key principles, new digital technologies will have a profound impact along four lines of development:

  • The first concerns data usage, computing power and connectivity, and takes the form of Big Data, open data, the Internet of Things, machine-to-machine communication and cloud computing for the centralization and preservation of information;
  • The second is analytics: once the data has been collected, value has to be extracted from it. Today only 1% of the data collected is used by businesses, which could benefit from “machine learning”, that is, machines that improve their performance by “learning” from the data they collect and analyze;
  • The third line of development is the interaction between man and machine, which involves increasingly widespread “touch” interfaces and augmented reality;
  • Finally, there is the whole area that deals with the transition from digital to “real”, which includes additive manufacturing, 3D printing, robotics, communications, machine-to-machine interactions and new technologies for storing and using energy in a targeted way, streamlining costs and optimizing performance.

Machine-to-machine is a term referring to technologies or services that enable an automatic exchange of data between machines, with or without human interaction. Some examples are warehouse management equipment, sensor equipment and localization equipment. M2M communications can also occur over the Internet Protocol (IP) and are therefore associated with the Internet of Things (IoT), where objects “become recognizable and acquire intelligence by being able to communicate data about themselves and access aggregate information from others”.
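
As a minimal sketch of machines exchanging data over IP with no human involved, the following standard-library Python program lets a simulated “sensor” push one JSON reading over a TCP socket to a “collector” running in the same process; the device name, reading and message format are invented for illustration.

```python
import json
import socket
import socketserver
import threading

# Collector: a machine that receives readings sent by other machines.
class CollectorHandler(socketserver.StreamRequestHandler):
    def handle(self):
        reading = json.loads(self.rfile.readline())
        print("collector received:", reading)

server = socketserver.TCPServer(("127.0.0.1", 0), CollectorHandler)
host, port = server.server_address

# Sensor: another machine sends one reading automatically over IP.
def send_reading():
    reading = {"device": "warehouse-sensor-7", "temperature_c": 4.2}
    with socket.create_connection((host, port)) as conn:
        conn.sendall((json.dumps(reading) + "\n").encode())

threading.Thread(target=send_reading).start()
server.handle_request()   # the collector processes exactly one incoming message
server.server_close()
```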

Software as a Service allows users to connect to cloud-based apps over the internet and use them. Common examples are email, calendars and productivity tools. The SaaS model offers a complete software solution purchased from a cloud service provider: an organization rents the use of an app and its users connect to it over the Internet, usually with a web browser. The underlying infrastructure, middleware, app software and app data are all located in the service provider’s data center. The service provider manages the hardware and software and, with the appropriate service contract, ensures the availability and security of the app and of the data. The SaaS model allows all types of organizations to get up and running quickly with a minimal startup cost. Some benefits of using SaaS are:

  • Access to sophisticated applications. To provide SaaS apps to users, there is no need to buy, install, upgrade or manage any hardware, middleware or software; even the most sophisticated business applications, such as ERP and CRM, become affordable for organizations that do not have the resources to purchase, deploy and manage the required infrastructure and software;
  • Payment only for the resources used. Users pay only for what they use and can also save money because the SaaS service scales automatically with the level of usage;
  • Use of free client software. Users can run most SaaS apps directly from the web browser without having to download and install any software, though some apps require plugins; there is no need to buy and install special software for company employees;
  • Simple workforce mobility. The SaaS model makes workforce mobility easy, because users can access data and SaaS apps from any mobile device or computer connected to the Internet; there is no need to develop apps for different types of computers and devices, because the service provider has already done so, nor to employ staff with specific expertise in mobile-device security. A careful choice of service provider ensures data security regardless of the type of device used;
  • Access to app data from anywhere. With data stored in the cloud, users can access their information from any mobile device or computer connected to the internet, and app data is not lost if a user’s computer or device fails.

The Internet of Everything can be seen as an evolution of the Internet of Things. Whereas in the latter case it is only and exclusively electronic devices that use the network to communicate and exchange data, the IoE involves the interconnection of devices (smartphones, tablets, smartwatches, fitness trackers and wearable devices in general, smart TVs, home appliances and more), people, processes and data. At the base of everything there will be a smart network able to sense, learn and respond, offering new services and features that will ensure greater security, simplicity and reliability in the most diverse areas. The IoE will involve not only businesses but also the personal and social sphere. It will connect more than 50 billion devices, forming a network of billions of sensors able to record and share every single event they witness, bringing enormous economic benefits and optimizing every process. Internet of Everything means that the whole world becomes communicative thanks to new technologies (wired or wireless), whether cars, buildings, plants, product packaging, stereo speakers or glasses. In short, all objects, animate or inanimate, can communicate with real people thanks to sensors, SIM cards, RFID tags and, more and more, our smartphones. The addition of a new intelligent component will allow objects to become real communication nodes across the World Wide Web.
The Internet of Everything will change everything: as new people, processes, data and things connect and interact with the network, new information and services become available that would otherwise not be possible.

The Internet of Things is the interconnection of physical devices, vehicles, buildings and other objects, integrated with electronic components, software, sensors and connectivity, which allows these objects to collect and exchange data. The goal of connected objects is to simplify life by automating processes or providing information that was not previously available. Today there are an estimated 5 to 10 billion Internet-connected objects, a number expected to reach 25 billion by 2020.

A Totem is an interactive display for general use that allows, for example, an employee to clock in and out of work, a visitor to browse websites with information about the company, or the display of important information not directly related to the company; the device is generally used while standing and is placed in a common area or lobby. A totem generally offers a fixed set of functions, far fewer than a normal device would have.

Workflow is the total or partial automation of a business process, in other words the passage of documents or information according to well-defined rules. A workflow participant is a business resource that carries out a task associated with a particular activity. A resource can be human, that is, a single individual or a group of people sharing a set of tasks to be carried out, and it interacts with a software application and/or specific hardware.
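
As a minimal sketch of documents moving between participants according to well-defined rules, the following Python example routes a hypothetical expense claim through a few steps; the steps, participants and approval rule are invented for illustration and do not refer to any specific workflow product.

```python
# Each step names the participant responsible for it and a rule deciding
# which step the document moves to next (None means the workflow is finished).
def after_review(doc):
    return "approve" if doc["amount"] <= 1000 else "escalate"

workflow = {
    "submit":   {"participant": "employee",   "next": lambda doc: "review"},
    "review":   {"participant": "team lead",  "next": after_review},
    "approve":  {"participant": "accounting", "next": lambda doc: None},
    "escalate": {"participant": "manager",    "next": lambda doc: None},
}

def run(doc, step="submit"):
    while step is not None:
        task = workflow[step]
        print(f"{task['participant']} handles step '{step}'")
        step = task["next"](doc)

run({"type": "expense claim", "amount": 1800})  # routed to the manager
```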
