970 results for Front end Developer


Relevance: 80.00%

Abstract:

This project sets out to explain how marketing techniques are integrated with the community strategic relationship, given that organizations make use of community concepts. The main marketing strategies are analyzed: the marketing mix, geomarketing, services marketing, relationship marketing and social marketing. The marketing techniques examined include direct marketing, product differentiation, market segmentation, market research, market intelligence, distribution channel optimization and electronic commerce. In addition, community strategies such as community coalitions, grassroots organizations, community leadership and empowerment are presented. The methodology used for this project is theoretical-conceptual, bringing together the contributions of several scientific documents from diverse areas of knowledge. The sources of information, concepts and theories were selected at the researcher's discretion, according to their descriptive value for the proposed integration. The research concludes that marketing techniques and strategies enable communication between organizations and communities, making participation between the two parties possible, which is a key factor in the emergence of the community strategic relationship. Further research on the community strategic relationship, applied to organizations and communities, is recommended.

Relevance: 80.00%

Abstract:

The growing momentum of SDIs (Spatial Data Infrastructures) is generating demand for the construction of geoportals and, with it, demand for tools that not only ease their construction, configuration and deployment but also offer the option of contracting professional technical support. OpenGeo Suite is a professional, integrated open-source software package that covers everything from the storage of geographic data to its publication using OGC standards and the implementation of web GIS solutions with open-source JavaScript libraries. OpenGeo Suite can be deployed on multiple platforms (Linux, Windows and OS X) and consists of four tightly integrated open-source components based on OGC standards. The server-side components are aimed at the storage, configuration and publication of data by GIS technicians: PostgreSQL with the PostGIS spatial extension handles the storage of geographic information and supports spatial analysis functions; pgAdmin acts as the database management system, easing data import and updates; GeoServer handles the publication of geographic information from different data sources (PostGIS, SHP, Oracle Spatial, GeoTIFF, etc.), supporting most OGC standards for publishing geographic information (WMS, WFS, WCS) and the GML, KML, GeoJSON and SLD formats, and also offers tile caching through GeoWebCache. OpenGeo Suite provides two applications, GeoExplorer and GeoEditor, which allow a technician to build a geoportal with geometry-editing capabilities, as well as an administration console (Dashboard) that eases the configuration of the components. On the client side, the components are JavaScript development libraries aimed at developers of web GIS applications: OpenLayers, with support for raster and vector layers, styles, projections, tiling, editing tools, etc.; and GeoExt, for building the front end of geoportals, based on ExtJS and tightly coupled to OpenLayers.
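To make the stack concrete, here is a minimal, hedged sketch of the client side described above: an OpenLayers 2 page consuming a WMS layer published by GeoServer. The endpoint URL and the layer name (GeoServer's "topp:states" demo layer) are illustrative assumptions, not details from the abstract.

```typescript
// Minimal OpenLayers 2 client for a GeoServer WMS layer, as in the
// OpenGeo Suite stack described above. URL and layer are assumptions.
declare const OpenLayers: any; // OpenLayers 2.x global, loaded via <script>

const map = new OpenLayers.Map("map"); // attaches to <div id="map">

// WMS layer served by GeoServer (one of the OGC standards listed above)
const wms = new OpenLayers.Layer.WMS(
  "US States (demo)",
  "http://localhost:8080/geoserver/wms", // assumed local GeoServer endpoint
  { layers: "topp:states", format: "image/png" } // GeoServer demo layer
);

map.addLayer(wms);
map.zoomToMaxExtent();
```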

Relevance: 80.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. Fifty-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) the scientist prepares input files on his or her local machine; (2) using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource; (3) the scientist runs the relevant workflow script on his or her local machine, unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun"; (4) the G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run; (5) the scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front end to larger-scale Grid resources such as the UK National Grid Service.
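The REST design described above means a model run can be driven by any plain HTTP client. The following is a hedged sketch of that interaction pattern in TypeScript; the server URL, resource paths and JSON fields are invented for illustration, since the abstract does not document the actual G-Rex API.

```typescript
// Hypothetical launch-and-poll client for a REST-style middleware such as
// G-Rex. Every URL and field name below is an assumption for illustration.
const base = "http://cluster.example.org/grex"; // assumed G-Rex server URL

async function launchAndMonitor(): Promise<void> {
  // Start a model run by POSTing to a hypothetical service resource
  const start = await fetch(`${base}/services/nemo/instances`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ args: ["-year", "1"] }), // illustrative arguments
  });
  const { instanceUrl } = await start.json(); // assumed response field

  // Poll the run's state, much as a browser or curl could against a REST API
  let state = "RUNNING";
  while (state === "RUNNING") {
    await new Promise((r) => setTimeout(r, 5000)); // wait 5 s between polls
    const res = await fetch(instanceUrl);
    ({ state } = await res.json()); // assumed response field
    console.log(`run state: ${state}`);
  }
}

launchAndMonitor().catch(console.error);
```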

Relevance: 80.00%

Abstract:

In this paper we present an architecture for network and applications management, which is based on the Active Networks paradigm and shows the advantages of network programmability. The stimulus to develop this architecture arises from an actual need to manage a cluster of active nodes, where it is often required to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, to check the status of the nodes and to configure them. Moreover, the proposed framework makes it possible to explore an active network, monitor the active applications, query each node and install programmable traps. In order to take advantage of Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can interoperate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service translates the queries from XML into an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) on a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detect the causes of network failures by tracing network events in time.
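Because the manager and monitor front-ends talk to the Active Network Access Point in XML, which the gateway translates into PLAN code, the client side can be sketched briefly. The XML vocabulary, MIB object name, host and port below are assumptions made for this example; the paper's actual schema is not reproduced here.

```typescript
// Illustrative sketch only: a front-end sending an XML query to the
// Active Network Access Point gateway, which (per the abstract) would
// translate it into PLAN packets. Schema, host and port are assumptions.
import { Socket } from "net";

const query = `<?xml version="1.0"?>
<anap-request>
  <query target="node-7" mib="active-snmp">
    <object name="ifTraffic"/>      <!-- hypothetical MIB object -->
  </query>
</anap-request>`;

const sock = new Socket();
sock.connect(9000, "anap.example.net", () => sock.write(query)); // assumed endpoint
sock.on("data", (reply) => {
  console.log("gateway reply:", reply.toString()); // XML answer from the gateway
  sock.end();
});
```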

Relevance: 80.00%

Abstract:

The complexity of construction projects and the fragmentation of the construction industry undertaking those projects has effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve a successful project and process execution. In addition, the decision-making mechanisms, from a client perspective, are illustrated and the foundations for a learning organization/industry are facilitated within a consistent Process Protocol.

Relevance: 80.00%

Abstract:

Letter identification is a critical front end of the reading process. In general, conceptualizations of the identification process have emphasized arbitrary sets of distinctive features. However, a richer view of letter processing incorporates principles from the field of type design, including an emphasis on uniformities across letters within a font. The importance of uniformities is supported by a small body of research indicating that consistency of font increases letter identification efficiency. We review design concepts and the relevant literature, with the goal of stimulating further thinking about letter processing during reading.

Relevance: 80.00%

Abstract:

The construction industry is widely recognised as inherently subject to risk and uncertainty. This necessitates effective project risk management to achieve the project objectives of time, cost and quality. A popular tool employed in projects to aid in the management of risk is the risk register. This tool documents the project risks and is often employed by the Project Manager (PM) to manage the associated risks on a project. This research aims to ascertain how widely risk registers are used by Project Managers as part of their risk management practices. Achieving this aim entailed interviewing ten PMs to discuss their use of the risk register as a risk management tool. The results of these interviews indicated the prevalent use of this document and recognised its effectiveness in the management of project risks. The findings identified the front-end and feasibility phases of a project as crucial stages for using risk registers, noting the register as a vital ingredient of risk response planning in the decision-making process. Moreover, the composition of the risk register was examined, providing an insight into how PMs produce and develop this tool. In conclusion, this research signifies the extensive use of the risk register by PMs. A majority of PMs were of the view that risk registers constitute an essential component of their project risk management practices. This suggests a need for further research on the extent to which risk registers actually help PMs to control the risks in a construction project, particularly residual risks, and how this can be improved to minimise deviations from expected outcomes.

Relevance: 80.00%

Abstract:

The Main Injector Neutrino Oscillation Search (MINOS) experiment uses an accelerator-produced neutrino beam to perform precision measurements of the neutrino oscillation parameters in the "atmospheric neutrino" sector associated with muon neutrino disappearance. This long-baseline experiment measures neutrino interactions in Fermilab's NuMI neutrino beam with a near detector at Fermilab and again 735 km downstream with a far detector in the Soudan Underground Laboratory in northern Minnesota. The two detectors are magnetized steel-scintillator tracking calorimeters. They are designed to be as similar as possible in order to ensure that differences in detector response have minimal impact on the comparisons of event rates, energy spectra and topologies that are essential to MINOS measurements of oscillation parameters. The design, construction, calibration and performance of the far and near detectors are described in this paper.

Relevance: 80.00%

Abstract:

The H.R. MacMillan Space Centre is a multi-faceted organization whose mission is to educate, inspire and evoke a sense of wonder about the universe, our planet and space exploration. As a popular Vancouver science centre, it faces the same range of challenges and issues as other major attractions: how does the Space Centre maintain healthy public attendance in an increasingly competitive market, where visitors are presented with an ever richer range of choices for their leisure spending and entertainment dollars? This front-end study investigated visitor attitudes, thoughts and preconceptions on the topic of space and astronomy. It also examined visitors' motivations for coming to a space science centre. Useful insights were obtained which will be applied to improve future programme content and exhibit development.

Relevance: 80.00%

Abstract:

The current paper presents a study conducted at the At-Bristol Science Centre, UK. It is a front-end evaluation for the "Live Science Zone" at At-Bristol, which will be built during the autumn of 2004 and will provide a facility for programmed events and shows, non-programmed investigative activities, and the choice of passive or active exploration of current scientific topics. The main aim of the study is to determine what kinds of techniques should be used in the Live Science Zone. The objectives are to explore what has already been done at At-Bristol and at other science centres, and to identify successful devices. The secondary aim is to map which topics visitors are actually interested in debating. The methods used in the study are in-depth qualitative interviews with professionals working in the field of science communication in Europe and North America, and questionnaires answered by visitors to At-Bristol. The results show that there are some gaps between the intentions of the professionals and the opinions of the visitors, in terms of opportunities and willingness for dialogue in science centre activities. The most popular issue was Future and the most popular device was Film.

Relevance: 80.00%

Abstract:

Most science centres in Canada employ science-educated floor staff to motivate visitors to have fun while enhancing the educational reach of the exhibits. Although bright and sensitive to visitors' needs, floor staff are rarely consulted in the planning, implementation, and modification phases of an exhibit. Instead, many development teams rely on costly third-party evaluations or skip the front-end and formative evaluations altogether, leading to costly errors that could have been avoided. This study will seek to reveal a correlation between floor staff's perception of visitors' interactions with an exhibit and visitors' actual experiences. If a correlation exists, a recommendation could be made to encourage planning teams to include floor staff in the formative and summative evaluations of an exhibit. This is especially relevant to science centres with limited budgets and for whom a divide exists between floor staff and management. In this study, a formative evaluation of one exhibit was conducted, measuring both floor staff's perceptions of the visitor experience and visitors' own perceptions of the exhibit. Floor staff were then trained on visitor evaluation methods. A week later, floor staff and visitors were surveyed a second time on a different exhibit to determine whether an increase in accuracy existed. The training session increased the specificity of the motivation and comprehension responses and the enthusiasm of the staff, but not their ability to predict observed behaviours with respect to ergonomics, learning indicators, holding power, and success rates. The results revealed that although floor staff underestimated visitors' success rates at the exhibits, staff accurately predicted visitors' behaviours with respect to holding power, ergonomics, learning indicators, motivation and comprehension, both before and after the staff training.

Relevance: 80.00%

Abstract:

For an organisation to undertake a Customer Relationship Management (CRM) implementation program, it needs to consider a multitude of requirements. Some authors have hinted at viewing CRM from a wider perspective than a purely customer-centric one. The aim of this paper is to discuss the domain and conceptualise some of the requirements of CRM from an organisation's point of view. CRM needs to be understood in terms of the whole organisation, including its internal and external environments. Undertaking CRM in any organisation needs to be preceded by a sequence of stages: an organisation needs to develop a roadmap outlining the path to becoming CRM-centric. For an implementation program to be effective, the organisation should therefore address, or at least consider, a list of such factors at every stage of the program. The main focus of the CRM literature has been customer-centric. This paper, being the first stage of much wider research, focuses instead on the organisation and its internal environment. It identifies three information systems (IS) and information technology (IT) requirements in organisations that are integral parts of CRM and must achieve a level of synergy for CRM to succeed. Understanding these three requirements (front-end systems, back-end systems, and data-handling technologies) in a CRM project is too great a task at this early stage of the research; this paper begins to draw together the tenuous links between them. In writing this paper and shifting its focus towards requirements engineering, the author has realised that a whole further area of literature has to be explored, because CRM is, ultimately, another IS implementation.

Relevance: 80.00%

Abstract:

The overarching goal of this dissertation was to evaluate the contextual components of instructional strategies for the acquisition of complex programming concepts. A meta-knowledge processing model is proposed on the basis of the research findings, thereby facilitating the selection of media treatment for electronic courseware. When implemented, this model extends the work of Smith (1998), as a front-end methodology, for his glass-box interpreter called Bradman, for teaching novice programmers. Technology now provides the means to produce individualized instructional packages with relative ease. Multimedia and Web courseware development accentuate a highly graphical (or visual) approach to instructional formats. Typically, little consideration is given to the effectiveness of screen-based visual stimuli, and curiously, students are expected to be visually literate, despite the complexity of human-computer interaction. Visual literacy is much harder for some people to acquire than for others! (see Chapter Four: Conditions-of-the-Learner) An innovative research programme was devised to investigate the interactive effect of instructional strategies, enhanced with text-plus-textual metaphors or text-plus-graphical metaphors, and cognitive style, on the acquisition of a special category of abstract (process) programming concept. This type of concept was chosen to focus on the role of analogic knowledge involved in computer programming. The results are discussed within the context of the internal/external exchange process, drawing on Ritchey's (1980) concepts of within-item and between-item encoding elaborations. The methodology developed for the doctoral project integrates earlier research knowledge in a novel, interdisciplinary, conceptual framework, including: concept learning models from instructional science in the USA; British cognitive psychology and human memory research, for defining the cognitive style construct; and Australian educational research, to provide the measurement tools for instructional outcomes. The experimental design consisted of a screening test to determine cognitive style, a pretest to determine prior domain knowledge of abstract programming knowledge elements, the instruction period, and a post-test to measure improved performance. This research design provides a three-level discovery process to articulate: (1) the fusion of strategic knowledge required by the novice learner for dealing with contexts within instructional strategies; (2) acquisition of knowledge using measurable instructional outcomes and learner characteristics; and (3) knowledge of the innate environmental factors which influence the instructional outcomes. This research has successfully identified the interactive effect of instructional strategy, within an individual's cognitive style construct, on the acquisition of complex programming concepts. However, the significance of the three-level discovery process lies in the scope of the methodology to inform the design of a meta-knowledge processing model for instructional science. Firstly, the British cognitive style testing procedure is a low-cost, user-friendly computer application that effectively measures an individual's position on the two cognitive style continua (Riding & Cheema, 1991). Secondly, the QUEST Interactive Test Analysis System (Izard, 1995) allows for a probabilistic determination of an individual's knowledge level, relative to other participants and relative to test-item difficulties. Test items can be related to skill levels and, consequently, can be used by instructional scientists to measure knowledge acquisition. Finally, an effect size analysis (Cohen, 1977) allows for a direct comparison between treatment groups, giving a statistical measurement of how large an effect the independent variables have on the dependent outcomes. Combined with QUEST's hierarchical positioning of participants, this tool can assist in identifying preferred learning conditions for the evaluation of treatment groups. By combining these three assessment analysis tools in instructional research, a computerized learning shell, customised for individuals' cognitive constructs, can be created (McKay & Garner, 1999). While this approach has widespread application, individual researchers/trainers would nonetheless need to validate the interactive effects within their specific learning domain with an extensive pilot study programme (McKay, 1999a; McKay, 1999b). Furthermore, the instructional material need not be limited to a textual/graphical comparison, but could be applied to any two or more instructional treatments of any kind, for instance a structured versus an exploratory strategy. The possibilities and combinations are believed to be endless, provided the focus is maintained on linking the front-end identification of cognitive style with an improved performance outcome. My in-depth analysis provides a better understanding of the interactive effects of the cognitive style construct and instructional format on the acquisition of abstract concepts involving spatial relations and logical reasoning. In providing the basis for a meta-knowledge processing model, this research is expected to be of interest to educators, cognitive psychologists, communications engineers and computer scientists specialising in computer-human interactions.
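As a brief aside on the effect size analysis cited above: Cohen's d, in its usual form, is the difference between two group means divided by their pooled standard deviation. A minimal sketch follows; the sample scores and group names are invented for illustration and are not data from the dissertation.

```typescript
// Cohen's d (common definition): (mean(a) - mean(b)) / pooled SD.
// All sample data below are invented for illustration only.
function mean(xs: number[]): number {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function variance(xs: number[]): number {
  const m = mean(xs);
  // sample variance with the (n - 1) denominator
  return xs.reduce((s, x) => s + (x - m) ** 2, 0) / (xs.length - 1);
}

function cohensD(a: number[], b: number[]): number {
  // pooled standard deviation for two independent groups
  const pooled = Math.sqrt(
    ((a.length - 1) * variance(a) + (b.length - 1) * variance(b)) /
      (a.length + b.length - 2)
  );
  return (mean(a) - mean(b)) / pooled;
}

// Hypothetical post-test scores for two instructional treatment groups
const textual = [62, 70, 75, 68, 71];
const graphical = [74, 80, 78, 85, 77];
console.log(`effect size d = ${cohensD(graphical, textual).toFixed(2)}`);
```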

Relevance: 80.00%

Abstract:

Distributed denial of service (DDoS) attacks are a continuing, critical threat to the Internet. Evolving beyond the lower layers, new application-layer DDoS attacks that use legitimate HTTP requests to overwhelm victim resources are harder to detect. The problem is more serious still when such attacks mimic, or occur during, a flash crowd event at a popular Website. In this paper, we present the design and implementation of CALD, an architectural extension to protect Web servers against various DDoS attacks that masquerade as flash crowds. CALD provides real-time detection using mess tests, but differs from other systems that use similar methods. First, CALD uses a front-end sensor to monitor traffic that may contain DDoS attacks or flash crowds. An intense pulse in the traffic indicates a possible anomaly, because this is the basic property of both DDoS attacks and flash crowds. Once abnormal traffic is identified, the sensor sends an ATTENTION signal to activate the attack detection module. Second, CALD dynamically records the average request frequency of each source IP and checks the total mess extent. Theoretically, the mess extent of DDoS attacks is larger than that of flash crowds. Thus, with parameters from the attack detection module, the filter is capable of letting legitimate requests through while stopping attack traffic. Third, CALD can separate the security modules from the Web servers, so the core Web services keep maximum performance regardless of DDoS harassment. In the experiments, traffic records from www.sina.com and www.taobao.com demonstrate the value of CALD.
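As a hedged illustration of the per-source frequency check described above, the sketch below counts requests per source IP over a window and computes an entropy-style dispersion score. Reading "mess extent" as Shannon entropy, and the threshold value, are assumptions of this example rather than details given in the paper.

```typescript
// Hedged sketch of the per-IP frequency idea in the abstract: count
// requests per source IP over a time window and compute an entropy-style
// dispersion measure. Treating "mess extent" as Shannon entropy is this
// example's assumption, as is the threshold.
function messExtent(requestsByIp: Map<string, number>): number {
  const total = [...requestsByIp.values()].reduce((a, b) => a + b, 0);
  let h = 0;
  for (const count of requestsByIp.values()) {
    const p = count / total; // relative request frequency of this source IP
    h -= p * Math.log2(p);   // accumulate Shannon entropy
  }
  return h;
}

// Invented sample window: a few genuine clients hammering one hot page
// (flash-crowd-like) keep the entropy low. A DDoS from many spoofed
// sources would spread the counts and push the entropy up.
const counts = new Map<string, number>([
  ["10.0.0.1", 4],
  ["10.0.0.2", 3],
  ["10.0.0.3", 400],
]);

const THRESHOLD = 2.0; // illustrative cut-off; would be tuned empirically
console.log(messExtent(counts) > THRESHOLD ? "possible DDoS" : "flash crowd?");
```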

Relevance: 80.00%

Abstract:

This paper focuses on the assessment of student teachers during practicum. The study is contextualised in an Australian pre-service teacher education program in which practicum has been reconceptualised to help bridge the theory–practice gap commonly associated with “front-end loading” programs. Survey and interview data collected from student teachers and supervising teachers point to what participants perceive as disparate understandings between university and school staff about the nature and role of assessment and suggest that this lack of common understanding adversely affects students’ experiences of assessment.