22 results for Consumerization of IT


Relevance: 100.00%

Abstract:

Study of the efficiency of reducing the number of terms used in consumer emotional-response lexicons: an application to beer

Relevance: 100.00%

Abstract:

Transactional systems such as Enterprise Resource Planning (ERP) systems have been implemented widely, while analytical software such as Supply Chain Management (SCM) add-ons is adopted less by manufacturing companies. Although significant benefits are reported to stem from SCM software implementations, companies are reluctant to invest in such systems. On the one hand this is due to the lack of methods able to detect the benefits of using SCM software; on the other hand, the associated costs are not identified, detailed and quantified sufficiently. Coordination schemes based only on ERP systems are valid alternatives in industrial practice because significant investment in IT can be avoided. Therefore, the evaluation of these coordination procedures, in particular the cost due to iterations, is of high managerial interest, and the corresponding methods are comprehensive tools for strategic IT decision making. The purpose of this research is to provide evaluation methods that allow the comparison of different organizational forms and software support levels.

The research begins with a comprehensive introduction dealing with the business environment that industrial networks are facing and concludes by highlighting the challenges for the supply chain software industry. Afterwards, the central terminology is addressed, focusing on organization theory, IT investment peculiarities and supply chain management software typology. The literature review classifies recent supply chain management research referring to organizational design and its software support; the classification encompasses criteria related to research methodology and content. Empirical studies from management science focus on network types and organizational fit. Novel planning algorithms and innovative coordination schemes are developed mostly in the field of operations research in order to propose new software features. Operations and production management researchers carry out cost-benefit analyses of IT implementations. The literature review reveals that the success of software solutions for network coordination depends strongly on the fit of three dimensions: network configuration, coordination scheme and software functionality. The reviewed literature is mostly centred on the benefits of SCM software implementations; however, ERP-based supply chain coordination is still widespread industrial practice, and the associated coordination cost has not been addressed by researchers.

Fundamentals of efficient organizational design are explained in as much detail as required for understanding the synthesis of different organizational forms. Several coordination schemes have been shaped through the variation of the following design parameters: organizational structure, coordination mechanisms and software support. The different organizational proposals are evaluated using a heuristic approach and a simulation-based method; in both cases, the principles of organization theory are respected. A lack of performance is due to dependencies between activities which are not managed properly. Therefore, within the heuristic method, dependencies are classified and their intensity is measured based on contextual factors. Afterwards, the suitability of each organizational design element for the management of a specific dependency is determined. Finally, each organizational form is evaluated based on the contribution of the sum of design elements to coordination benefit and to coordination cost. Coordination benefit refers to improvement in logistic performance; this is the core concept of most supply chain evaluation models. Unfortunately, the coordination cost which must be incurred to achieve the benefits is usually not considered in detail. Iterative processes are costly when executed manually; this is the case when SCM software is not implemented and the ERP system is the only available coordination instrument.

The heuristic model provides a simplified procedure for the classification of dependencies, the quantification of influence factors and the systematic search for adequate organizational forms and IT support. Discrete-event simulation is applied in the second evaluation model using the software package 'Plant Simulation'. On the one hand, logistic performance is measured by manufacturing, inventory and transportation cost and penalties for lost sales; on the other hand, coordination cost is considered explicitly, taking into account iterative coordination cycles. The method is applied to an exemplary supply chain configuration considering various parameter settings. The simulation results confirm that, in most cases, benefit increases when coordination is intensified. However, in some situations in which manual, iterative planning cycles are applied, additional coordination cost does not always lead to improved logistic performance; these unexpected results cannot be attributed to any particular parameter. The research confirms the great importance of dimensions disregarded up to now when evaluating SCM concepts and IT tools. The heuristic method provides a quick, but only approximate, comparison of coordination efficiency for different organizational forms. In contrast, the more complex simulation method delivers detailed results, taking into consideration specific parameter settings of the network context and the organizational design.
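
To illustrate the kind of trade-off that both evaluation methods quantify, the following minimal Python sketch compares an ERP-only scheme with manual iterative planning cycles against an SCM-supported scheme; all figures and names are hypothetical illustrations, not values or code from the thesis.

# Minimal sketch: trade-off between coordination cost and logistic benefit.
# All figures are hypothetical illustrations, not values from the thesis.

def total_cost(logistic_cost, iterations, cost_per_manual_cycle, software_cost=0.0):
    """Logistic cost plus the coordination cost of iterative planning cycles."""
    coordination_cost = iterations * cost_per_manual_cycle + software_cost
    return logistic_cost + coordination_cost

# ERP-only coordination: no extra software cost, but many manual iterative cycles.
erp_only = total_cost(logistic_cost=100_000, iterations=12, cost_per_manual_cycle=2_500)

# SCM-supported coordination: extra software cost, fewer manual cycles and
# (assumed) better logistic performance.
scm_supported = total_cost(logistic_cost=90_000, iterations=2,
                           cost_per_manual_cycle=2_500, software_cost=20_000)

print(f"ERP-only total cost:      {erp_only:>10,.0f}")
print(f"SCM-supported total cost: {scm_supported:>10,.0f}")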

Relevance: 90.00%

Abstract:

Three methodologies to assess As bioaccessibility were evaluated using playground soil collected from 16 playgrounds in Madrid, Spain: two (the Simplified Bioaccessibility Extraction Test, SBET, and hydrochloric acid extraction, HCl) assess gastric-only bioaccessibility, and the third (the Physiologically Based Extraction Test, PBET) evaluates mouth-gastric-intestinal bioaccessibility. Aqua regia-extractable (pseudo-total) As contents, which are routinely employed in risk assessments, were used as the reference to establish the following percentages of bioaccessibility: SBET, 63.1; HCl, 51.8; PBET, 41.6, with the highest values associated with the gastric-only extractions. For Madrid playground soils, characterised by a very uniform, weakly alkaline pH and low Fe oxide and organic matter contents, the statistical analysis of the results indicates that, in contrast with other studies, the highest percentage of As in the samples was bound to carbonates and/or present as calcium arsenate. As opposed to the As bound to Fe oxides, this As is readily released in the gastric environment as the carbonate matrix is decomposed and calcium arsenate is dissolved, but some of it is subsequently sequestered in unavailable forms as the pH is raised to 5.5 to mimic intestinal conditions. The HCl extraction can be used as a simple and reliable (i.e. low residual standard error) proxy for the more expensive, time-consuming and error-prone PBET methodology. The HCl method would essentially halve the estimate of carcinogenic risk for children playing in Madrid playground soils, providing a more representative value of the associated risk than the pseudo-total concentrations used at present.
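
To make the practical consequence concrete, the short Python sketch below applies the reported mean bioaccessible fractions to a hypothetical pseudo-total As concentration (the 20 mg/kg value is invented for illustration, not a measured sample) and shows why the HCl-based estimate roughly halves the exposure term used in a carcinogenic risk calculation.

# Sketch: effect of using bioaccessible As instead of pseudo-total As.
pseudo_total_as = 20.0  # mg/kg, aqua regia-extractable As (hypothetical sample, not from the study)

bioaccessible_fraction = {"SBET": 0.631, "HCl": 0.518, "PBET": 0.416}  # mean values from the abstract

for method, fraction in bioaccessible_fraction.items():
    print(f"{method}: {pseudo_total_as * fraction:.1f} mg/kg bioaccessible As")

# Carcinogenic risk scales linearly with the exposure concentration, so using the
# HCl fraction (~52%) roughly halves the risk estimated from pseudo-total contents.
print(f"Risk ratio, HCl-based vs pseudo-total: {bioaccessible_fraction['HCl']:.2f}")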

Relevance: 90.00%

Abstract:

The intense activity in the construction sector during the last decade has generated huge volumes of construction and demolition (C&D) waste. On average, Europe has generated around 890 million tonnes of construction and demolition waste per year. Although the activity has now entered a phase of decline, due to the change in the economic cycle, we must not forget the problems caused by such waste, or rather by its management, which is still far from achieving the overall target set by the Waste Framework Directive: 70% of C&D waste (excluding soil and stones not containing dangerous substances) should be recycled in the EU countries by 2020. In fact, the reality is that only 50% of the C&D waste generated in the EU is recycled, and 40% of that corresponds to the recycling of soil and stones not containing dangerous substances. Aware of this situation, European countries are implementing national policies as well as different measures to prevent avoidable waste and to promote increased recycling and recovery. In this respect, this article gives an overview of the amount of C&D waste generated in European countries, the amount of this waste that is being recycled, and the different measures that European countries have applied to address this situation.
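
A short worked calculation, based only on the percentages quoted above (and reading "40% of it" as 40% of the recycled amount), shows how little of the generated C&D waste is recycled once soil and stones are set aside.

# Worked example using the percentages quoted in the abstract.
# Interpretation assumed: the 40% refers to the share of the *recycled* waste
# that is soil and stones not containing dangerous substances.
recycled_share_of_generated = 0.50
soil_share_of_recycled = 0.40

non_soil_recycled_share = recycled_share_of_generated * (1 - soil_share_of_recycled)
print(f"Recycling other than soil and stones: {non_soil_recycled_share:.0%} of all generated C&D waste")
# Note: the 70% Waste Framework Directive target is defined over C&D waste
# excluding soil and stones, so the denominators differ; the figure above is
# only an order-of-magnitude comparison.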

Relevance: 90.00%

Abstract:

Since its inception, Wikipedia has grown into a solid and stable project and has turned into a mass collaboration tool that allows the sharing and distribution of knowledge. The wiki approach on which this initiative is based promotes the participation and collaboration of users. In addition to visits for browsing its contents, Wikipedia also receives contributions from users aimed at improving them. In the past, researchers have paid attention to different aspects concerning the authoring and quality of contents; however, little effort has been made to study the nature of the visits that Wikipedia receives. We conduct such a study using a sample of users' requests provided by the Wikimedia Foundation in the form of Squid log lines. Our sample contains more than 14,000 million requests from users all around the world, directed to all the projects maintained by the Wikimedia Foundation, including the different editions of Wikipedia. This paper describes the work carried out to characterize the traffic directed to Wikipedia, consisting of the requests sent by its users. Our main aim is to obtain a detailed description of its composition in terms of the percentages corresponding to the different types of requests that make it up. The benefits of our work may range from the prediction of traffic peaks to the determination of the kinds of resources most often requested, which can be useful for scalability considerations.
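
As a minimal sketch of the kind of traffic characterisation described here, the Python fragment below classifies simplified, invented log lines by request type and reports percentages; the log format and URL patterns are assumptions for illustration and do not reproduce the exact Wikimedia Squid format.

# Sketch: classify requests from Squid-like log lines and report percentages.
from collections import Counter
from urllib.parse import urlparse

def classify(url):
    """Very rough request classification; the patterns are illustrative assumptions."""
    path = urlparse(url).path
    if path.startswith("/w/index.php") and "action=edit" in url:
        return "edit"
    if path.startswith("/w/api.php"):
        return "api"
    if path.endswith((".png", ".jpg", ".svg", ".css", ".js")):
        return "asset"
    return "page view"

log_lines = [
    "1360000000.123 GET http://en.wikipedia.org/wiki/Supply_chain",
    "1360000001.456 GET http://en.wikipedia.org/w/index.php?title=Beer&action=edit",
    "1360000002.789 GET http://upload.wikimedia.org/commons/logo.png",
]

counts = Counter(classify(line.split()[-1]) for line in log_lines)
total = sum(counts.values())
for kind, n in counts.most_common():
    print(f"{kind}: {100 * n / total:.1f}%")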

Relevance: 90.00%

Abstract:

After more than a decade of development work and hopes, the usage of the mobile Internet has finally taken off. We are now witnessing the first signs of what might become the explosion of mobile content and applications that will shape the (mobile) Internet of the future. As with the wired Internet, search will become very relevant for the usage of the mobile Internet. Current research on mobile search has applied a limited set of methodologies and has also generated a narrow range of meaningful results. This article covers new ground, exploring the use and visions of mobile search through a qualitative study based on user interviews. Its main conclusion builds upon the hypothesis that mobile search is sensitive to a mobile logic different from today's. First, (advanced) users want to access the entire Internet from their mobile devices, rather than subsections of it. Second, success is based on new added-value applications that exploit unique mobile functionalities. The authors' interpretation is that such a mobile logic fundamentally involves the use of personalised and context-based services.

Relevance: 90.00%

Abstract:

"Teamwork" is one of the abilities most valued by employers. In [16] we describe the process of adapting to the ECTS methodologies (for ongoing assessment) a course in computer programming for students in a technical degree (Marine Engineering, UPM) not specifically dedicated to computing. As a further step in this process we have emphasized cooperative learning. To this end, the students were paired and the work of each pair was evaluated via surprise tests taken and graded jointly, which constituted a substantial part of the final grade. Here we document this experience, discussing methodological aspects, describing indicators for measuring the impact of these methodologies on the educational experience, and reporting on the students' opinion of it.

Relevance: 90.00%

Abstract:

This paper presents a comparison of acquisition models related to decision analysis for IT supplier selection. The main standards considered are: Capability Maturity Model Integration for Acquisition (CMMI-ACQ), ISO/IEC 12207 Information Technology / Software Life Cycle Processes, IEEE 1062 Recommended Practice for Software Acquisition, the IT Infrastructure Library (ITIL) and the Project Management Body of Knowledge (PMBOK) guide. The objective of this paper is to compare these models in order to identify their advantages and disadvantages for the future development of a decision model for IT supplier selection.
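
One straightforward way to organise such a comparison is a model-versus-criterion matrix; the Python sketch below shows a possible data structure, where the criteria and the filled-in cells are illustrative placeholders rather than the paper's actual findings.

# Sketch: a model-versus-criterion comparison matrix for IT acquisition standards.
models = ["CMMI-ACQ", "ISO/IEC 12207", "IEEE 1062", "ITIL", "PMBOK"]
criteria = ["covers supplier selection", "prescribes evaluation steps", "process maturity focus"]

comparison = {
    ("CMMI-ACQ", "process maturity focus"): "strength",
    ("IEEE 1062", "covers supplier selection"): "strength",
    ("ITIL", "prescribes evaluation steps"): "partial",
    # remaining cells would be filled in from the actual comparison in the paper
}

print(f"{'model':<16}" + "".join(f"{c:<30}" for c in criteria))
for model in models:
    row = "".join(f"{comparison.get((model, c), '?'):<30}" for c in criteria)
    print(f"{model:<16}{row}")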

Relevance: 90.00%

Abstract:

Networks of Evolutionary Processors (NEPs) are computing mechanisms directly inspired by the behaviour of cell populations, more specifically by the point mutations in DNA strands. These mechanisms have been used to solve NP-complete problems by postulating a parallel computation model. This paper describes an implementation of the basic NEP model using Web technologies, which also makes it possible to design some of its most common variants through the web page design, easing the configuration of a given problem. The system is intended to run on a multicore processor in order to benefit from multithreading.
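
As an illustration of the basic model, the following much simplified Python sketch implements nodes that apply substitution rules (point mutations) and exchange words through symbol-based filters; the rules, filters and toy alphabet are invented for the example and do not reproduce the Web-based implementation described in the paper.

# Simplified sketch of a Network of Evolutionary Processors (NEP).
def substitutions(word, old, new):
    """All words reachable by one point mutation: replace one occurrence of old by new."""
    return {word[:i] + new + word[i + 1:] for i, ch in enumerate(word) if ch == old}

class Processor:
    def __init__(self, rules, output_symbol, input_symbol):
        self.rules = rules                  # list of (old_symbol, new_symbol) substitution rules
        self.output_symbol = output_symbol  # a word may leave if it contains this symbol
        self.input_symbol = input_symbol    # a word may enter if it contains this symbol
        self.words = set()

    def evolve(self):
        new_words = set(self.words)
        for word in self.words:
            for old, new in self.rules:
                new_words |= substitutions(word, old, new)
        self.words = new_words

def communicate(network, edges):
    """Move words that pass the sender's output filter and the receiver's input filter."""
    outgoing = {name: {w for w in node.words if node.output_symbol in w}
                for name, node in network.items()}
    for src, dst in edges:
        accepted = {w for w in outgoing[src] if network[dst].input_symbol in w}
        network[dst].words |= accepted
        network[src].words -= accepted

# Toy network: node 'a2b' rewrites a -> b, node 'b2c' rewrites b -> c.
network = {
    "a2b": Processor(rules=[("a", "b")], output_symbol="b", input_symbol="a"),
    "b2c": Processor(rules=[("b", "c")], output_symbol="c", input_symbol="b"),
}
edges = [("a2b", "b2c")]
network["a2b"].words = {"aab"}

for _ in range(4):                      # alternate evolutionary and communication steps
    for node in network.values():
        node.evolve()
    communicate(network, edges)

print(sorted(network["b2c"].words))     # words that reached the second processor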

Relevance: 90.00%

Abstract:

In the present competitive environment, companies are wondering how to reduce their IT costs while increasing their efficiency and their agility to react when changes in the business processes are required. Cloud Computing is the latest paradigm for optimizing the use of IT resources, considering "everything as a service" and receiving these services from the Cloud (Internet) instead of owning and managing hardware and software assets. The benefits of the model are clear. However, there are also concerns and issues to be solved before Cloud Computing spreads across the different industries. The model allows a pay-per-use approach for IT services and offers many benefits, such as cost savings, agility to react when business demands change, and simplicity, because there will not be any infrastructure to operate and administer. It will be comparable to well-known utilities such as electricity, water or gas. However, this paper underlines several risk factors of the model, and leading technology companies should research solutions to minimize the risks described in this article. Keywords: Cloud Computing, Utility Computing, Elastic Computing, Enterprise Agility.
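
The pay-per-use argument can be illustrated with a back-of-the-envelope comparison; all prices and usage figures in the sketch below are hypothetical and serve only to show the break-even reasoning.

# Back-of-the-envelope pay-per-use comparison. All figures are hypothetical.
owned_monthly_cost = 2_000.0          # amortised hardware, licences and administration
cloud_price_per_hour = 0.50           # hypothetical price of an equivalent cloud instance

for instance_hours in (200, 730, 2_000):   # light use, one instance full-time, several instances at peak
    cloud_cost = instance_hours * cloud_price_per_hour
    cheaper = "cloud" if cloud_cost < owned_monthly_cost else "owned infrastructure"
    print(f"{instance_hours:>5} instance-hours/month: "
          f"cloud {cloud_cost:8.2f} vs owned {owned_monthly_cost:8.2f} -> {cheaper} is cheaper")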

Relevance: 90.00%

Abstract:

Many attempts have been made to provide multilinguality to the Semantic Web by means of annotation properties in Natural Language (NL), such as RDFS or SKOS labels, and other lexicon-ontology models, such as lemon, but there are still many issues to be solved if we want to have a truly accessible Multilingual Semantic Web (MSW). The reusability of monolingual resources (ontologies, lexicons, etc.), the accessibility of multilingual resources hindered by many formats, the reliability of ontological sources, disambiguation problems, and the multilingual presentation of all this information to the end user in NL can be mentioned as some of the most relevant problems. Unless this NL presentation is achieved, the MSW will remain restricted to IT experts, and even for them with great dissatisfaction and disenchantment.
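
The kind of NL annotation mentioned here can be sketched with the rdflib library for Python; the library choice, the resource URI and the labels are assumptions made for the example, not resources from the paper.

# Sketch: multilingual labels attached to a single ontology term using rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS, SKOS

EX = Namespace("http://example.org/ontology#")   # invented namespace for the example
g = Graph()
g.bind("ex", EX)
g.bind("skos", SKOS)

g.add((EX.Beer, RDFS.label, Literal("beer", lang="en")))
g.add((EX.Beer, RDFS.label, Literal("cerveza", lang="es")))
g.add((EX.Beer, SKOS.prefLabel, Literal("Bier", lang="de")))

print(g.serialize(format="turtle"))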

Relevance: 90.00%

Abstract:

As a consequence of cinema screens being placed in front of the screen loudspeakers, a reduction in sound quality has been noticed. Cinema screens not only let sound pass through them, but also absorb a small amount of it and reflect part of the sound impacting on the screen back towards the loudspeaker, from where it may be sent forward again. This backwards reflection, added to the signal coming directly from the loudspeaker, can lead to constructive or destructive interference at certain frequencies, which usually results in comb filtering. In this project, this effect has been studied by reviewing data sheets provided by different manufacturers, through acoustical measurements carried out in the large anechoic chamber of the ISVR, and with theoretical models developed in MatLab. If the results obtained with MatLab are accurate enough in comparison with the real measurements taken in the anechoic chamber, they provide a good way to predict the attenuation added to the system at each frequency, given that not all manufacturers provide an attenuation curve, but only an average attenuation. This average attenuation can be of little use because sound waves of different wavelengths propagate through partitions differently: high frequencies have short wavelengths and are usually easier to attenuate than low frequencies, which have longer wavelengths. This information would therefore be of great value both to screen manufacturers, who could offer much more precise data in their data sheets, and to customers, who would have more information at their disposal before purchasing and installing anything in their cinemas, being able to judge for themselves which screen or loudspeaker best meets their expectations.

The digitisation of film soundtracks makes it possible to improve the sound quality of cinemas. However, one aspect to take into account is the transmission of the sound through the screen, since the loudspeakers are normally located behind it. The acoustic properties vary depending on the type of screen used, and there is little accessible information with which to assess their behaviour. In this project, three screen samples donated by different manufacturers are analysed in order to determine, depending on the type of screen, the optimum distance and inclination of the screen with respect to the loudspeaker. The analysis was carried out in the anechoic chamber of the ISVR (University of Southampton) by building a 2x2 m wooden frame on which the cinema screens were stretched, together with a loudspeaker whose behaviour is as close as possible to that of real screen loudspeakers. The data were captured by four microphones placed at different positions and connected to Brüel & Kjær Pulse software, from which the frequency responses of the loudspeaker were obtained with and without the screen, and with the screen at different distances from the loudspeaker. The data were then analysed in MatLab, where the attenuation, the pressure transmission factor (PTF) and the cepstrum analysis were computed. Finally, a theoretical model of the behaviour of perforated screens was developed, based on the perforated plates used to attenuate sound between rooms.

The conclusion was that curved screens are acoustically more transparent than perforated screens, which become acoustically more opaque above 6 kHz. In perforated screens the attenuation depends on the number of perforations per unit area and on their diameter; this attenuation will be reduced if the diameter of the perforations is reduced or if the number of perforations is increased. Regarding the comb-filter effect, to minimise its amplitude the screen should be placed at a distance of between 15 and 30 cm from the loudspeaker; at a distance of 30 cm the last reflection identified through the cepstrum analysis arrives 5 ms after the direct signal, and therefore it should not degrade the sound or the clarity of speech.
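
The comb-filter mechanism described above can be reproduced with a simple two-path model; in the Python sketch below the reflection coefficient is a hypothetical value, and screen absorption and loudspeaker directivity are ignored.

# Sketch of the comb filter produced by a single reflection between screen and loudspeaker.
import numpy as np

c = 343.0      # speed of sound in air, m/s
d = 0.30       # loudspeaker-to-screen distance, m (within the 15-30 cm range above)
r = 0.2        # hypothetical net reflection coefficient of the screen/loudspeaker round trip

delay = 2 * d / c                            # extra path length travelled by the reflected sound
frequencies = np.array([100.0, 286.0, 572.0, 1000.0, 4000.0])   # example frequencies, Hz
magnitude = np.abs(1 + r * np.exp(-2j * np.pi * frequencies * delay))

for f, m in zip(frequencies, magnitude):
    print(f"{f:7.0f} Hz: {20 * np.log10(m):+5.1f} dB relative to the direct sound alone")

print(f"First comb-filter notch expected near {1 / (2 * delay):.0f} Hz")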

Relevance: 90.00%

Abstract:

In the past few years IT outsourcing has gained a lot of importance in the market and, for example, the IT services outsourcing market is still growing every year. Now more than ever, organizations are increasingly becoming acquirers of needed capabilities, obtaining products and services from suppliers and developing less and less of these capabilities in-house. IT supplier selection is a complex and opaque decision problem, and managers facing such a decision have difficulty framing what needs to be thought about further in their discussions. Also, according to a study from the SEI (Software Engineering Institute) [40], 20 to 25 percent of large information technology (IT) acquisition projects fail within two years and 50 percent fail within five years. Mismanagement, poor requirements definition, the lack of comprehensive evaluations that could be used to come up with the best candidates for outsourcing, inadequate supplier selection and contracting processes, insufficient technology selection procedures, and uncontrolled requirements changes are factors that contribute to project failure.

The majority of project failures could be avoided if the acquirer learned how to understand the decision problems, perform better decision analysis, and exercise good judgment. The main objective of this work is the development of a decision model for IT supplier selection that tries to reduce the number of failures observed in client-supplier relationships, most of which are caused by a poor selection of the supplier. Besides the problems described above, the motivation for this work is the absence of any decision model based on a multi-model approach (a mixture of acquisition models and decision methods) for the problem of IT supplier selection. In the case study, nine Spanish companies were analyzed according to the IT supplier selection decision model developed in this work. Two software products were used in this case study: Expert Choice and D-Sight.
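
As a minimal sketch of the scoring step that such a decision model involves, the Python fragment below ranks invented suppliers with a plain weighted sum; the criteria, weights and scores are hypothetical, and this is not the specific method implemented in Expert Choice or D-Sight.

# Sketch: simple weighted-scoring ranking of IT suppliers (all data invented).
criteria_weights = {"cost": 0.35, "technical capability": 0.40, "support": 0.25}

suppliers = {
    "Supplier A": {"cost": 7, "technical capability": 9, "support": 6},
    "Supplier B": {"cost": 9, "technical capability": 6, "support": 8},
    "Supplier C": {"cost": 6, "technical capability": 8, "support": 9},
}

def weighted_score(scores):
    """Plain weighted sum of the criterion scores (a 0-10 scale is assumed)."""
    return sum(criteria_weights[criterion] * value for criterion, value in scores.items())

for name, scores in sorted(suppliers.items(), key=lambda item: weighted_score(item[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")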

Relevance: 90.00%

Abstract:

Electronic and mechanical media such as film, television, photography and offset printing are examples of how fast and how important technological development has become in society. Emerging technologies and continuous development provide ever newer and better possibilities for advanced services. Nowadays, multi-view video has been developed with different tools and applications, with the main goal of being more innovative and of offering its technical capabilities in a way that is friendly to all users in terms of management and accessibility (only an Internet connection is needed). The intention of every technology is to innovate in order to gain more users and become popular, so an implementation is important in this case. With the outreach of multi-view video in mind, and the importance of becoming more global these days, an application that supports this aim has been developed, offering the possibility of language selection within the same scenario. Finally, it is important to point out that, thanks to the continuous technological progress of multi-view video, a more intercultural market will be reachable, making it a shared contribution of society to the world's global development.

Relevance: 90.00%

Abstract:

There is no doubt that not a single reference to domotics can be found in the first half of the 20th century. The best-known authors and those who have documented this discipline set its origin in the 1970s, when the X-10 technology began to be used, but it was not until 1988 that the Larousse Encyclopedia decided to include a definition of "Smart Building". Furthermore, even nowadays there is no single widely accepted definition, and for that reason many other expressions, namely "Intelligent Buildings", "Domotics", "Digital Home" or "Home Automation", have appeared to describe automated buildings and homes. The lack of a clear definition of "Smart Buildings" not only hinders the development of a common international framework for research in this field, but also causes insecurity in the potential users of these buildings. That is to say, users do not know what this kind of building offers, which hinders the dissemination of the culture of building automation in society. Thus, the main purpose of this paper is to propose a definition of the expression "Smart Buildings" that satisfactorily describes the meaning of this discipline. To achieve this aim, a thorough review of the origin of the term itself and of the historical background before the emergence of the phenomenon of domotics was conducted, followed by a critical discussion of existing definitions of the term "Smart Buildings" and other similar terms. The extent of each definition has been analyzed, inaccuracies have been discarded and commonalities have been compared. Throughout the discussion, definitions have been found that bring the term "Smart Buildings" close to disciplines such as computer science, robotics and telecommunications. However, there are also many other definitions that emphasize, in a more abstract way, the role of these new buildings in society and in the future of mankind.