957 results for Front-end receivers


Relevance:

80.00%

Publisher:

Abstract:

This project aims to explain how marketing techniques are integrated with the community strategic relationship, given that organizations make use of community concepts. The main marketing strategies are analysed: marketing mix, geomarketing, services marketing, relationship marketing and social marketing. Marketing techniques are then explained, including direct marketing, product differentiation, market segmentation, market research, market intelligence, optimization of distribution channels and electronic commerce. In addition, community strategies such as community coalitions, grassroots organizations, community leadership and empowerment are presented. The methodology used in this project is theoretical-conceptual and brings together contributions from several scientific documents across different areas of knowledge. The sources of information, concepts and theories were selected according to the researcher's judgement, based on their descriptive potential for the proposed integration. The study concludes that marketing techniques and strategies enable communication between organizations and communities. This makes participation between both parties possible and is a key factor in the emergence of the community strategic relationship. Further research on the community strategic relationship, applied to organizations and communities, is recommended.

Relevance:

80.00%

Publisher:

Abstract:

The growing dynamism of SDIs (spatial data infrastructures) generates demand for building geoportals and, consequently, for tools that not only simplify their construction, configuration and deployment but also offer the option of contracting professional technical support. OpenGeo Suite is a professional, integrated free-software package that covers everything from the storage of geographic data to its publication using OGC standards and the implementation of web GIS solutions with open-source JavaScript libraries. OpenGeo Suite can be deployed on multiple platforms (Linux, Windows and OS X) and consists of four tightly integrated free-software components based on OGC standards. The server-side components are aimed at the storage, configuration and publication of data by GIS technicians: PostgreSQL plus the PostGIS spatial extension handles the storage of geographic information and supports spatial analysis functions; pgAdmin serves as the database management system, simplifying data import and updates; GeoServer publishes geographic information from different data sources (PostGIS, SHP, Oracle Spatial, GeoTIFF, etc.), supporting the main OGC publication standards (WMS, WFS, WCS) and formats (GML, KML, GeoJSON, SLD), and also offers tile caching through GeoWebCache. OpenGeo Suite includes two applications, GeoExplorer and GeoEditor, which allow the technician to build a geoportal with geometry-editing capabilities, and an administration console (Dashboard) that simplifies the configuration of the components. On the client side, the components are JavaScript development libraries aimed at web GIS application developers: OpenLayers, with support for raster and vector layers, styles, projections, tiling, editing tools, etc., and GeoExt, for building the front end of geoportals, based on ExtJS and tightly coupled to OpenLayers.
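
The publication path described above can be exercised with nothing more than an OGC WMS GetMap request against GeoServer; clients such as OpenLayers/GeoExt issue equivalent requests automatically. The sketch below is only an illustration: the GeoServer URL, the workspace:layer name and the bounding box are assumptions, not values taken from the text.

```python
# Minimal sketch: fetch a rendered map from a GeoServer WMS endpoint (OGC WMS 1.1.1).
# The endpoint URL, layer name and extent are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import urlretrieve

GEOSERVER_WMS = "http://localhost:8080/geoserver/wms"  # assumed local GeoServer
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "geoportal:municipios",   # hypothetical workspace:layer
    "STYLES": "",
    "SRS": "EPSG:4326",
    "BBOX": "-9.5,35.9,4.4,43.8",       # minx,miny,maxx,maxy in lon/lat (example extent)
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

# Download the rendered map image produced by GeoServer from the PostGIS data.
urlretrieve(f"{GEOSERVER_WMS}?{urlencode(params)}", "map.png")
```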

Relevance:

80.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al. 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front end to larger-scale Grid resources such as the UK National Grid Service.
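
The workflow substitution in step (3) and the output streaming in step (4) suggest the shape of a very small REST client. The sketch below is only an illustration of that pattern, under assumed endpoint paths (`/runs`, `/status`, `/output`) and an assumed service URL; it does not reproduce the actual G-Rex API.

```python
# Illustrative REST-style client: submit a run, then pull output back while the
# run is in progress so it never accumulates remotely. All URLs are hypothetical.
import time
import urllib.request

BASE_URL = "http://cluster.example.org/grex/nemo"   # assumed service URL

def submit_run(input_archive: bytes) -> str:
    """POST the model input files; return the URL of the new run (hypothetical API)."""
    req = urllib.request.Request(f"{BASE_URL}/runs", data=input_archive, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Location"]             # server names the run resource

def stream_output(run_url: str, poll_seconds: int = 60) -> None:
    """Fetch output repeatedly while the job runs, mirroring G-Rex's streaming idea."""
    while True:
        with urllib.request.urlopen(f"{run_url}/status") as resp:
            status = resp.read().decode().strip()
        with urllib.request.urlopen(f"{run_url}/output") as resp:
            with open("latest_output.tar", "wb") as out:
                out.write(resp.read())
        if status == "FINISHED":
            break
        time.sleep(poll_seconds)

if __name__ == "__main__":
    with open("nemo_inputs.tar", "rb") as f:        # placeholder input archive
        run_url = submit_run(f.read())
    stream_output(run_url)
```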

Relevance:

80.00%

Publisher:

Abstract:

In this paper we present an architecture for network and applications management, which is based on the Active Networks paradigm and shows the advantages of network programmability. The stimulus to develop this architecture arises from an actual need to manage a cluster of active nodes, where it is often required to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, to check the status of the nodes and to configure them. Moreover, the proposed framework makes it possible to explore an active network, monitor the active applications, query each node and install programmable traps. In order to take advantage of the Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can interoperate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service performs the translation of the queries from XML to an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) in a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detect the causes of network failures by tracing network events in time.
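
To make the XML front-end/gateway split more concrete, here is a toy sketch of the pattern: the management front end phrases a query as XML, and a gateway translates it into an active-packet-style instruction before injection. The element names, the MIB variable and the pseudo-PLAN output are invented for illustration and do not reproduce the paper's actual protocol.

```python
# Toy sketch of the XML query / gateway translation pattern (all names hypothetical).
import xml.etree.ElementTree as ET

def build_query(node_id: str, mib_variable: str) -> bytes:
    """Front end: express a request for one MIB-like variable on one active node as XML."""
    query = ET.Element("query", attrib={"node": node_id})
    ET.SubElement(query, "variable").text = mib_variable
    return ET.tostring(query, encoding="utf-8")

def gateway_translate(xml_query: bytes) -> str:
    """Gateway stand-in: turn the XML query into a pretend active-packet call for injection."""
    root = ET.fromstring(xml_query)
    node = root.get("node")
    variable = root.findtext("variable")
    return f"inject(node={node}, program=getVar('{variable}'))"  # illustrative only

if __name__ == "__main__":
    xml_query = build_query("active-node-7", "ifInOctets")
    print(xml_query.decode())
    print(gateway_translate(xml_query))
```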

Relevance:

80.00%

Publisher:

Abstract:

The complexity of construction projects and the fragmentation of the construction industry undertaking those projects have effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project, with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve successful project and process execution. In addition, the decision-making mechanisms are illustrated from a client perspective, and the foundations for a learning organization/industry are laid within a consistent Process Protocol.

Relevance:

80.00%

Publisher:

Abstract:

Letter identification is a critical front end of the reading process. In general, conceptualizations of the identification process have emphasized arbitrary sets of distinctive features. However, a richer view of letter processing incorporates principles from the field of type design, including an emphasis on uniformities across letters within a font. The importance of uniformities is supported by a small body of research indicating that consistency of font increases letter identification efficiency. We review design concepts and the relevant literature, with the goal of stimulating further thinking about letter processing during reading.

Relevance:

80.00%

Publisher:

Abstract:

The construction industry is widely recognised as being inherently subject to risk and uncertainty. This necessitates effective project risk management to achieve the project objectives of time, cost and quality. A popular tool employed in projects to aid in the management of risk is the risk register. This tool documents the project risks and is often employed by the Project Manager (PM) to manage the associated risks on a project. This research aims to ascertain how widely risk registers are used by Project Managers as part of their risk management practices. Achieving this aim entailed interviewing ten PMs to discuss their use of the risk register as a risk management tool. The results from these interviews indicated the prevalent use of this document and recognised its effectiveness in the management of project risks. The findings identified the front-end and feasibility phases of a project as crucial stages for using risk registers, noting the register as a vital ingredient in the risk response planning of the decision-making process. Moreover, the composition of the risk register was also examined, providing insight into how PMs produce and develop this tool. In conclusion, this research signifies the extensive use of the risk register by PMs. A majority of PMs were of the view that risk registers constitute an essential component of their project risk management practices. This suggests a need for further research on the extent to which risk registers actually help PMs to control the risks in a construction project, particularly residual risks, and how this can be improved to minimize deviations from expected outcomes.

Relevance:

80.00%

Publisher:

Abstract:

The Main Injector Neutrino Oscillation Search (MINOS) experiment uses an accelerator-produced neutrino beam to perform precision measurements of the neutrino oscillation parameters in the "atmospheric neutrino" sector associated with muon neutrino disappearance. This long-baseline experiment measures neutrino interactions in Fermilab's NuMI neutrino beam with a near detector at Fermilab and again 735 km downstream with a far detector in the Soudan Underground Laboratory in northern Minnesota. The two detectors are magnetized steel-scintillator tracking calorimeters. They are designed to be as similar as possible in order to ensure that differences in detector response have minimal impact on the comparisons of event rates, energy spectra and topologies that are essential to MINOS measurements of oscillation parameters. The design, construction, calibration and performance of the far and near detectors are described in this paper.

Relevance:

80.00%

Publisher:

Abstract:

The H.R. MacMillan Space Centre is a multi-faceted organization whose mission is to educate, inspire and evoke a sense of wonder about the universe, our planet and space exploration. As a popular Vancouver science centre, it faces the same range of challenges and issues as other major attractions: how does the Space Centre maintain healthy public attendance in an increasingly competitive market, where visitors are presented with an ever richer range of choices for their leisure and entertainment spending? This front-end study investigated visitor attitudes, thoughts and preconceptions on the topic of space and astronomy. It also examined visitors’ motivations for coming to a space science centre. Useful insights were obtained which will be applied to improve future programme content and exhibit development.

Relevance:

80.00%

Publisher:

Abstract:

The current paper presents a study conducted at At-Bristol Science Centre, UK. It is a front-end evaluation for the “Live Science Zone” at At-Bristol, which will be built during the autumn of 2004 and will provide a facility for programmed events and shows, non-programmed investigative activities, and the choice of passive or active exploration of current scientific topics. The main aim of the study is to determine what kinds of techniques should be used in the Live Science Zone. The objectives are to explore what has already been done at At-Bristol and at other science centres, and to identify successful devices. The secondary aim is to map which topics visitors are actually interested in debating. The methods used in the study are in-depth qualitative interviews with professionals working in the field of science communication in Europe and North America, and questionnaires answered by visitors to At-Bristol. The results show that there are some gaps between the intentions of the professionals and the opinions of the visitors, in terms of opportunities and willingness for dialogue in science centre activities. The most popular topic was the Future, and the most popular device was film.

Relevance:

80.00%

Publisher:

Abstract:

Most science centres in Canada employ science-educated floor staff to motivate visitors to have fun while enhancing the educational reach of the exhibits. Although bright and sensitive to visitors’ needs, floor staff are rarely consulted in the planning, implementation, and modification phases of an exhibit. Instead, many development teams rely on costly third-party evaluations or skip the front-end and formative evaluations altogether, leading to costly errors that could have been avoided. This study will seek to reveal a correlation between floor staff’s perception of visitors’ interactions with an exhibit and visitors’ actual experiences. If a correlation exists, a recommendation could be made to encourage planning teams to include floor staff in the formative and summative evaluations of an exhibit. This is especially relevant to science centres with limited budgets and for whom a divide exists between floor staff and management. In this study, a formative evaluation of one exhibit was conducted, measuring both floor staff’s perceptions of the visitor experience and visitors’ own perceptions of the exhibit. Floor staff were then trained on visitor evaluation methods. A week later, floor staff and visitors were surveyed a second time on a different exhibit to determine whether an increase in accuracy existed. The training session increased the specificity of the motivation and comprehension responses and the enthusiasm of the staff, but not their ability to predict observed behaviours with respect to ergonomics, learning indicators, holding power, and success rates. The results revealed that although floor staff underestimated visitors’ success rates at the exhibits, staff accurately predicted visitors’ behaviours with respect to holding power, ergonomics, learning indicators, motivation and comprehension, both before and after the staff training.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work is to identify and describe the management processes of projects for adjusting the organizational structure, processes and governance of a large Brazilian company. Project management processes are the chain of actions and activities that allow the project objectives to be achieved, and they constitute the object of study of the PMBOK Guide and other bodies of knowledge in project management. The research identified and described the management flow of the management projects of one of the departments of Company "X", based on document analysis, interviews and participant observation. The results were analysed in the light of the theoretical framework on project management processes, front-end management processes and project governance. The research concluded that, compared with the management process set out in the PMBOK Guide, the management of the projects analysed places relatively greater emphasis on the processes that precede the formal start of the project (front-end processes), which are responsible for ensuring its alignment with organizational drivers and variables.

Relevance:

80.00%

Publisher:

Abstract:

With the constant growth of enterprises, and as the need to share information across departments and business areas becomes more critical, companies are turning to integration to provide a method for interconnecting heterogeneous, distributed and autonomous systems. Whether the sales application needs to interface with the inventory application, or the procurement application needs to connect to an auction site, it seems that any application can be made better by integrating it with other applications. Integration between applications can face several difficulties due to the fact that applications may not have been designed and implemented with integration in mind. With regard to integration issues, two-tier software systems, composed of the database tier and the “front-end” tier (interface), have shown some limitations. As a solution to overcome the two-tier limitations, three-tier systems were proposed in the literature. Thus, by adding a middle tier (referred to as middleware) between the database tier and the “front-end” tier (or simply the application), three main benefits emerge. The first benefit is that the division of software systems into three tiers enables increased integration capabilities with other systems. The second benefit is that modifications to the individual tiers may be carried out without necessarily affecting the other tiers and integrated systems, and the third benefit, a consequence of the others, is that fewer maintenance tasks are required in the software system and in all integrated systems. Concerning software development in three tiers, this dissertation focuses on two emerging technologies, the Semantic Web and Service-Oriented Architecture, combined with middleware. Blending these two technologies with middleware resulted in the development of the Swoat framework (Service and Semantic Web Oriented ArchiTecture) and leads to the following four synergic advantages: (1) it allows the creation of loosely coupled systems, decoupling the database from the “front-end” tiers and therefore reducing maintenance; (2) the database schema is transparent to the “front-end” tiers, which are aware only of the information model (or domain model) that describes what data is accessible; (3) integration with other heterogeneous systems is enabled through services provided by the middleware; (4) service requests by the “front-end” tier focus on ‘what’ data is needed and not on ‘where’ and ‘how’ it is obtained, thereby reducing application development time.
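
The decoupling described above (the front end programming against an information model exposed as middleware services, rather than against the database schema) can be illustrated with a short sketch. The names, service method and storage layout below are hypothetical stand-ins, not the actual Swoat API; the point is only that a schema change would affect the middleware, not the front-end tier.

```python
# Minimal three-tier sketch: the front end asks the middleware for domain concepts
# ("what"), and only the middleware knows the database schema ("where" and "how").
import sqlite3

# --- Database tier (schema known only to the middleware) ---------------------
def make_database() -> sqlite3.Connection:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE tbl_person (p_id INTEGER, p_name TEXT, p_mail TEXT)")
    conn.execute("INSERT INTO tbl_person VALUES (1, 'Ada', 'ada@example.org')")
    return conn

# --- Middleware tier (exposes the information/domain model as a service) -----
class Middleware:
    def __init__(self, conn: sqlite3.Connection):
        self._conn = conn

    def get_contacts(self) -> list[dict]:
        """Service: return domain-level 'contact' records, hiding table and column names."""
        rows = self._conn.execute("SELECT p_name, p_mail FROM tbl_person").fetchall()
        return [{"name": name, "email": mail} for name, mail in rows]

# --- Front-end tier (depends only on the domain model, not on the schema) ----
if __name__ == "__main__":
    service = Middleware(make_database())
    for contact in service.get_contacts():   # asks 'what', not 'where'/'how'
        print(f"{contact['name']} <{contact['email']}>")
```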

Relevance:

80.00%

Publisher:

Abstract:

This paper is based on the development and experimental analysis of a DCM boost interleaved converter suitable for application in the traction systems of electric vehicles pulled by electric motors (trolleybuses), which are powered by urban DC or AC distribution networks. This front-end structure is capable of providing significant improvements in trolleybus systems and in urban distribution network costs and efficiency. The architecture of the proposed converter is composed of five boost power cells in interleaved connection, operating in discontinuous conduction mode. Furthermore, the converter can operate as an AC-DC converter or as a DC-DC converter, providing the proper DC output voltage range required by DC or AC adjustable-speed drives. Therefore, when supplied by single-phase AC distribution networks and operating as an AC-DC converter, it is capable of providing a high power factor and reduced harmonic distortion in the input current, complying with the restrictions imposed by the IEC 61000-3-4 standard. The digital controller has been implemented using a low-cost FPGA and developed entirely in the hardware description language VHDL using fixed-point arithmetic. Two control strategies are evaluated with respect to compliance with the input-current restrictions imposed by IEC 61000-3-4: regular PWM modulation and a current-correction PWM modulation. In order to verify the feasibility and performance of the proposed system, experimental results from a 15 kW reduced-scale prototype are presented, operating under DC and AC conditions.
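
As a side note on the interleaving arrangement mentioned above: with N cells switched at the same frequency, the carriers are typically phase-shifted by 360°/N and the input-current ripple repeats at roughly N times the per-cell switching frequency. The arithmetic for the five-cell case is sketched below; the switching frequency used is an assumed placeholder, not a figure from the paper.

```python
# Carrier phase shift and effective ripple frequency for an N-cell interleaved converter.
# N = 5 comes from the abstract; the per-cell switching frequency is an assumed example.
N_CELLS = 5
F_SW_HZ = 20_000  # assumed per-cell switching frequency (placeholder)

phase_shift_deg = 360.0 / N_CELLS      # phase shift between adjacent cell carriers
ripple_freq_hz = N_CELLS * F_SW_HZ     # effective input-ripple frequency

print(f"Carrier phase shift: {phase_shift_deg:.0f} degrees")   # 72 degrees
print(f"Effective ripple frequency: {ripple_freq_hz} Hz")      # 100000 Hz
```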

Relevance:

80.00%

Publisher:

Abstract:

Web services are loosely coupled applications that use XML documents as a way of integrating distinct systems on the internet. Such documents are used in standards such as SOAP, WSDL and UDDI, which establish, respectively, integrated patterns for the representation of messages, the description of services, and the publication of services, thus facilitating interoperability between heterogeneous systems. Often a single service does not meet the user's needs, so new systems can be designed from the composition of two or more services; this is the design goal behind the Service-Oriented Architecture. Parallel to this scenario, we have the PEWS (Predicate Path-Expressions for Web Services) language, which specifies the behaviour of composite web service interfaces. The development of the PEWS language is divided into two parts: front end and back end. From a PEWS program, the front end performs lexical, syntactic and semantic analysis and finally generates XML code. The function of the back end is to execute the PEWS composition. This master's dissertation aims to: (i) reformulate the proposed architecture for the runtime system of the language; and (ii) implement the back end for PEWS using .NET Framework tools, so that PEWS programs are executed using Windows Workflow Foundation.