805 results for top-down analysis
Abstract:
The evolution of the Internet to the Web 2.0 model has created a new system known as Social Media, in which a huge number of social networks have proliferated, changing the forms of relationship and collaboration among users, and between users and companies. In response to these dramatic social and technological changes, which are currently reshaping business relationships, companies are finding it necessary to modify their strategy for using CRM (Customer Relationship Management) with their customers and to develop new capabilities that enable the creation of value with customers. This is where the concept of Social CRM appears, understood as a strategy focused on understanding, anticipating and responding better to the needs of existing and potential customers, leveraging social data to create strong, mutually beneficial relationships. This work describes an adoption model for Social CRM, applying a top-down analysis method and building on the Gartner model known as "The Eight Building Blocks of CRM" [1]. The work decomposes the adoption model described by Gartner into the following points:
- A strategic decision by the company.
- Taking stock of the social reality.
- Analysing the social networks.
- An adoption methodology.
- Deployment and extension across all departments of the company, and the adaptation of human resources.
- Selection of, and integration with, traditional CRM platforms.
- Analysis of monitoring tools for Social CRM.
The proposed model has two objectives: first, to provide insight into how CRM can influence business outcomes in the era of the social customer; and second, to show managers how existing CRM investments and resources can be integrated with new technologies and processes to form capabilities that improve business performance.
Abstract:
New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
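To make the error-budget bookkeeping concrete, here is a minimal sketch of the accounting a system designer might perform, summing worst-case per-component "expenditures" against a 1 µs application budget. All component figures below are illustrative assumptions, not the measured values reported in the paper.

```python
# Illustrative error-budget bookkeeping for a PTP synchronisation chain.
# The per-component figures are hypothetical placeholders; in practice each
# would come from component tests like those described in the paper.

APPLICATION_BUDGET_NS = 1_000  # 1 us sampling-accuracy requirement

# Hypothetical worst-case "expenditure" of each component in the chain.
expenditures_ns = {
    "grandmaster clock": 100,
    "transparent clocks (3 switches)": 3 * 50,
    "slave clock": 200,
    "network asymmetry allowance": 300,
}

total_ns = sum(expenditures_ns.values())
margin_ns = APPLICATION_BUDGET_NS - total_ns

for component, err in expenditures_ns.items():
    print(f"{component:32s} {err:5d} ns")
print(f"{'total expenditure':32s} {total_ns:5d} ns")
print(f"{'remaining margin':32s} {margin_ns:5d} ns")

assert total_ns <= APPLICATION_BUDGET_NS, "design exceeds the error budget"
```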
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
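As a rough sketch of the kind of user-created connection such a process network is meant to support, the following maps the output of one hypothetical design tool onto the input of a hypothetical analysis tool. The tool schemas, field names and units are invented for illustration; they are not part of the platform described.

```python
# Minimal sketch of a user-defined mapping node in a design-analysis
# process network. All names and units here are hypothetical.

def design_to_analysis(design_record: dict) -> dict:
    """Map a design tool's output onto an analysis tool's expected input."""
    return {
        "element_id": design_record["id"],
        "area_m2": design_record["width_mm"] * design_record["height_mm"] / 1e6,
        "material": design_record.get("material", "concrete"),
    }

# A process network is then a pipeline of such mapping functions
# connecting otherwise disconnected tools.
wall = {"id": "W-01", "width_mm": 3000, "height_mm": 2700}
print(design_to_analysis(wall))
```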
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Recent developments in analytical technologies have driven significant advances in lipid science. The sensitivity and selectivity of modern mass spectrometers can now provide for the detection and even quantification of many hundreds of lipids in a single analysis. In parallel, increasing evidence from structural biology suggests that a detailed knowledge of lipid molecular structure, including carbon-carbon double bond position, stereochemistry and acyl chain regiochemistry, is required to fully appreciate the biochemical role(s) of individual lipids. Here we review the capabilities and limitations of tandem mass spectrometry to provide this level of structural specificity in the analysis of lipids present in complex biological extracts. In particular, we focus on the capabilities of a novel technology termed ozone-induced dissociation to identify the position(s) of double bonds in unsaturated lipids and discuss its possible role in efforts to develop workflows that provide for complete structure elucidation of lipids by mass spectrometry alone: so-called top-down lipidomics. This article is part of a Special Issue entitled: Lipidomics and Imaging Mass Spectrometry. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Among all methods of metal alloy slurry preparation, the cooling slope method is the simplest in terms of design and process control. The method involves pouring the melt from the top, down an oblique, channel-shaped plate cooled from below by counter-flowing water. The melt, while flowing down, partially solidifies and forms columnar dendrites on the plate wall. These dendrites are broken into equiaxed grains and are washed away with the melt. The melt, together with the equiaxed grains, forms a semisolid slurry that is collected at the slope exit and cast into billets having a non-dendritic microstructure. The final microstructure depends on several process parameters such as slope angle, slope length, pouring superheat, and cooling rate. The present work involves a scaling analysis of the conservation equations of momentum, energy and species for the melt flow down a cooling slope. The main purpose of the scaling analysis is to obtain physical insight into the role and relative importance of each parameter in influencing the final microstructure. To assess the scaling analysis, the trends it predicts are compared against corresponding numerical results obtained with an enthalpy-based solidification model that incorporates solid phase movement.
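As a hedged illustration of what such a scaling analysis produces, the sketch below forms order-of-magnitude estimates of the dimensionless groups governing the melt film on the slope. All property values and geometric parameters are assumed placeholders, not the parameters used in the paper.

```python
# Order-of-magnitude scaling estimates for melt flow down a cooling slope.
# All property values and geometric parameters below are assumed
# placeholders, not the parameters used in the paper.
rho = 2.4e3     # melt density, kg/m^3 (aluminium alloy, assumed)
mu = 1.3e-3     # melt dynamic viscosity, Pa.s (assumed)
alpha = 4.0e-5  # melt thermal diffusivity, m^2/s (assumed)

L = 0.5         # slope length, m (process parameter, assumed)
h = 5e-3        # melt film thickness, m (assumed)
u = 0.5         # characteristic film velocity, m/s (assumed)

Re = rho * u * h / mu   # inertia vs. viscous forces in the film
Pe = u * h / alpha      # heat advection along the slope vs. diffusion across it
t_res = L / u           # melt residence time on the slope, s

print(f"Re ~ {Re:.0f}, Pe ~ {Pe:.0f}, residence time ~ {t_res:.2f} s")
```

Comparing how such groups and the residence time shift as slope angle, slope length or cooling rate change is the kind of insight the scaling analysis formalises.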
Abstract:
Almost 120 days at sea aboard three NOAA research vessels and one fishing vessel over the past three years have supported biogeographic characterization of Tortugas Ecological Reserve (TER). This work initiated measurement of the post-implementation effects of TER as a refuge for exploited species. In Tortugas South, seafloor transect surveys were conducted using divers, towed operated vehicles (TOV), remotely operated vehicles (ROV), various sonar platforms, and the Deepworker manned submersible. ARGOS drifter releases, satellite imagery, ichthyoplankton surveys, sea surface temperature, and diver census were combined to elucidate the potential dispersal of fish spawning in this environment. Surveys are being compiled into a GIS to allow resource managers to gauge benthic resource status and distribution. Drifter studies have determined that within the ~30 days of the larval life stage for fishes spawning at Tortugas South, larvae could reach as far downstream as Tampa Bay on the west Florida coast and Cape Canaveral on the east coast. Together with actual fish surveys and water mass delineation, this work demonstrates that the refuge status of this area endows it with tremendous downstream spillover and larval export potential for Florida reef habitats and promotes the maintenance of their fish communities. In Tortugas North, 30 randomly selected, permanent stations were established. Five stations were assigned to each of the following six areas: within Dry Tortugas National Park, falling north of the prevailing currents (Park North); within Dry Tortugas National Park, falling south of the prevailing currents (Park South); within the Ecological Reserve, falling north of the prevailing currents (Reserve North); within the Ecological Reserve, falling south of the prevailing currents (Reserve South); within areas immediately adjacent to these two strata, falling north of the prevailing currents (Out North); and within areas immediately adjacent to these two strata, falling south of the prevailing currents (Out South). Intensive characterization of these sites was conducted using multiple sonar techniques, TOV, ROV, diver-based digital video collection, diver-based fish census, towed fish capture, sediment particle-size analysis, benthic chlorophyll analyses, and stable isotope analyses of primary producers, fish, and shellfish. In order to complement and extend information from studies focused on the coral reef, we have targeted the ecotone between the reef and adjacent non-reef habitats, as these areas are well known in ecology for indicating changes in trophic relationships at the ecosystem scale. Such trophic changes are hypothesized to occur as top-down control of the system grows with protection of piscivorous fishes. Preliminary isotope data, in conjunction with our prior results from the west Florida shelf, suggest that the shallow-water benthic habitats surrounding the coral reefs of TER will prove to be the source of a significant amount of the primary production ultimately fueling fish production throughout TER and downstream throughout the range of larval fish dispersal. Therefore, the status and influence of the previously neglected non-reef habitat within the refuge (comprising ~70% of TER) appears to be intimately tied to the health of the coral reef community proper. These data, collected in a biogeographic context, employing an integrated Before-After Control Impact design at multiple spatial scales, leave us poised to document and quantify the post-implementation effects of TER.
Combined with the work at Tortugas South, this project represents a multi-disciplinary effort spanning sometimes disparate disciplines (fishery oceanography, benthic ecology, food web analysis, remote sensing/geography/landscape ecology, and resource management) and approaches (physical, biological, ecological). We expect the continuation of this effort to yield critical information for the management of TER and the evaluation of protected areas as refuges for exploited species.
Abstract:
BACKGROUND: The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do so by articulating current health service development practices. METHODS: Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method, in conjunction with diagrammatic elicitation, was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. RESULTS: Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; and service closure. Three common design stages - problem exploration, idea generation and solution evaluation - were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation, and implementation-based evaluation. CONCLUSIONS: This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and an innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers, through partnership working with design experts and researchers, could be beneficial.
Abstract:
Multiple sound sources often contain harmonics that overlap and may be degraded by environmental noise. The auditory system is capable of teasing apart these sources into distinct mental objects, or streams. Such an "auditory scene analysis" enables the brain to solve the cocktail party problem. A neural network model of auditory scene analysis, called the AIRSTREAM model, is presented to propose how the brain accomplishes this feat. The model clarifies how the frequency components that correspond to a given acoustic source may be coherently grouped together into distinct streams based on pitch and spatial cues. The model also clarifies how multiple streams may be distinguished and separated by the brain. Streams are formed as spectral-pitch resonances that emerge through feedback interactions between frequency-specific spectral representations of a sound source and its pitch. First, the model transforms a sound into a spatial pattern of frequency-specific activation across a spectral stream layer. The sound has multiple parallel representations at this layer. A sound's spectral representation activates a bottom-up filter that is sensitive to harmonics of the sound's pitch. The filter activates a pitch category which, in turn, activates a top-down expectation that allows one voice or instrument to be tracked through a noisy multiple-source environment. Spectral components are suppressed if they do not match harmonics of the top-down expectation that is read out by the selected pitch, thereby allowing another stream to capture these components, as in the "old-plus-new heuristic" of Bregman. Multiple simultaneously occurring spectral-pitch resonances can hereby emerge. These resonance and matching mechanisms are specialized versions of Adaptive Resonance Theory, or ART, which clarifies how pitch representations can self-organize during learning of harmonic bottom-up filters and top-down expectations. The model also clarifies how spatial location cues can help to disambiguate two sources with similar spectral cues. Data are simulated from psychophysical grouping experiments, such as how a tone sweeping upwards in frequency creates a bounce percept by grouping with a downward-sweeping tone due to proximity in frequency, even if noise replaces the tones at their intersection point. Illusory auditory percepts are also simulated, such as the auditory continuity illusion of a tone continuing through a noise burst even if the tone is not present during the noise, and the scale illusion of Deutsch, whereby downward and upward scales presented alternately to the two ears are regrouped based on frequency proximity, leading to a bounce percept. Since related sorts of resonances have been used to quantitatively simulate psychophysical data about speech perception, the model strengthens the hypothesis that ART-like mechanisms are used at multiple levels of the auditory system. Proposals for developing the model to explain more complex streaming data are also provided.
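A minimal sketch of the suppression step described above, under strong simplifying assumptions (a single fixed pitch, and a hard matching tolerance in place of graded ART resonance): components that do not align with harmonics of the selected pitch are excluded from the stream and remain available for capture by another stream.

```python
# Toy version of harmonic matching against a top-down pitch expectation.
# The frequencies, tolerance, and fixed pitch selection are simplifications,
# not the AIRSTREAM model's actual dynamics.
import numpy as np

def harmonic_match(components_hz, pitch_hz, tol=0.03):
    """Split components into the selected stream and the residual."""
    comps = np.asarray(components_hz, dtype=float)
    harmonic_number = np.rint(comps / pitch_hz)
    valid = harmonic_number >= 1
    # Relative deviation from the nearest harmonic of the selected pitch
    dev = np.abs(comps - harmonic_number * pitch_hz) / pitch_hz
    in_stream = valid & (dev < tol)
    return comps[in_stream], comps[~in_stream]

# Two interleaved harmonic sources: 200 Hz and 330 Hz fundamentals
mixture = [200, 330, 400, 600, 660, 800, 990]
stream, residual = harmonic_match(mixture, pitch_hz=200)
print("stream (pitch 200 Hz):", stream)      # [200. 400. 600. 800.]
print("residual for next stream:", residual)  # [330. 660. 990.]
```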
Abstract:
Although Common Pool Resources (CPRs) make up a significant share of total income for rural households in Ethiopia and elsewhere in the developing world, limited access to these resources and environmental degradation threaten local livelihoods. As a result, the issues of management and governance of CPRs, and of how to prevent their over-exploitation, are of great importance for development policy. This study examines the current state and dynamics of CPRs and the overall resource governance system of the Lake Tana sub-basin. The research employed a modified form of the Institutional Analysis and Development (IAD) framework. The framework integrates the concept of Socio-Ecological Systems (SES) and Interactive Governance (IG) perspectives, in which social actors, institutions, the politico-economic context, discourses and ecological features across governance and government levels are considered. It has been observed that overexploitation, degradation and encroachment of CPRs have increased dramatically, threatening the sustainability of the Lake Tana ecosystem. The stakeholder analysis reveals that there are multiple stakeholders with diverse interests in and power over CPRs. The analysis of institutional arrangements reveals that the existing formal rules and regulations governing access to and control over CPRs could not be implemented and were not effective in legally binding and governing CPR users' behavior at the operational level. The study also shows that a top-down, non-participatory process of policy formulation, law-making and decision-making overlooks local contexts (local knowledge and informal institutions). The outcomes of examining the participation of local resource users, as an alternative to a centralized, command-and-control, hierarchical approach to resource management and governance, call for a fundamental shift in CPR use, management and governance to facilitate the participation of stakeholders in decision-making. Therefore, establishing a multi-level stakeholder governance system, as an institutional structure and process, is necessary to sustain stakeholder participation in decision-making regarding CPR use, management and governance.
Abstract:
Background: Elective repeat caesarean delivery (ERCD) rates have been increasing worldwide, prompting obstetric discourse on the risks and benefits for the mother and infant. These increasing rates also have major economic implications for the health care system. Given the dearth of information on the cost-effectiveness of different modes of delivery, the aim of this paper was to perform an economic evaluation of the costs and short-term maternal health consequences associated with a trial of labour after one previous caesarean delivery compared with ERCD for low-risk women in Ireland. Methods: Using a decision analytic model, a cost-effectiveness analysis (CEA) was performed in which the measure of health gain was quality-adjusted life years (QALYs) over a six-week time horizon. A review of the international literature was conducted to derive representative estimates of adverse maternal health outcomes following a trial of labour after caesarean (TOLAC) and ERCD. Delivery/procedure costs were derived from primary data collection and combined both "bottom-up" and "top-down" costing estimations. Results: Maternal morbidities emerged in twice as many cases in the TOLAC group as in the ERCD group. However, TOLAC was found to be the more cost-effective method of delivery because it was substantially less expensive than ERCD (€1,835.06 versus €4,039.87 per woman, respectively), and QALYs were modestly higher (0.84 versus 0.70). These findings were supported by probabilistic sensitivity analysis. Conclusions: Clinicians need to be well informed of the benefits and risks of TOLAC among low-risk women. Ideally, clinician-patient discourse would address differences in length of hospital stay and postpartum recovery time. While it is premature to advocate a policy of TOLAC across maternity units, the results of the study prompt further analysis and repeat iterations, encouraging future studies to synthesise previous research and new, relevant evidence under a single comprehensive decision model.
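The dominance result can be checked from the paper's reported point estimates with standard CEA arithmetic; the snippet below is that check only, not the authors' decision model.

```python
# Back-of-envelope check of the dominance result, using the point
# estimates reported in the abstract.
cost_tolac, cost_ercd = 1835.06, 4039.87   # EUR per woman
qaly_tolac, qaly_ercd = 0.84, 0.70         # QALYs over six weeks

delta_cost = cost_tolac - cost_ercd   # negative: TOLAC is cheaper
delta_qaly = qaly_tolac - qaly_ercd   # positive: TOLAC yields more QALYs

print(f"incremental cost:  {delta_cost:+.2f} EUR")
print(f"incremental QALYs: {delta_qaly:+.2f}")

# Negative incremental cost with positive incremental QALYs means TOLAC
# dominates ERCD: no incremental cost-effectiveness ratio needs quoting.
print("TOLAC dominates ERCD:", delta_cost < 0 and delta_qaly > 0)
```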
Abstract:
This thesis analyses how dominant policy approaches to peacebuilding have moved away from a single and universalised understanding of peace to be achieved through a top-down strategy of democratisation and economic liberalisation, prevalent at the beginning of the 1990s. Instead, throughout the 2000s, peacebuilders have increasingly adopted a commitment to cultivating a bottom-up and hybrid peacebuilding process that is context-sensitive and intended to be more respectful of the needs and values of post-war societies. The projects of statebuilding in Kosovo and, to a lesser extent, in Bosnia are examined to illustrate the shift. By capturing this shift, I seek to argue that contemporary practitioners of peace are sharing the sensibility of the theoretical critics of liberalism. These critics have long contended that post-war societies cannot be governed from 'above' and have advocated the adoption of a bottom-up approach to peacebuilding. Now, both peace practitioners and their critics share the tendency to embrace difference in peacebuilding operations, but this shift has failed to address meaningfully the problems and concerns of post-conflict societies. The conclusion of this research is that, drawing on the assumption that these societies are not capable of undertaking sovereign acts because of their problematic inter-subjective frames, the discourses of peacebuilding (in policy-making and academic critique) have increasingly legitimised an open-ended role of interference by external agencies, which now operate from 'below'. Peacebuilding has turned into a long-term process, in which international and local actors engage relationally in the search for ever-more emancipatory hybrid outcomes, but in which self-government and self-determination are constantly deferred. Processes of emphasising difference have thus denied the political autonomy of post-war societies and have continuously questioned the political and human equality of these populations in a hierarchically divided world.
Abstract:
Ontologies have been established for knowledge sharing and are widely used as a means for conceptually structuring domains of interest. With the growing usage of ontologies, the problem of overlapping knowledge in a common domain becomes critical. In this short paper, we address two methods for merging ontologies based on Formal Concept Analysis: FCA-Merge and ONTEX. --- FCA-Merge is a method for merging ontologies following a bottom-up approach which offers a structural description of the merging process. The method is guided by application-specific instances of the given source ontologies. We apply techniques from natural language processing and formal concept analysis to derive a lattice of concepts as a structural result of FCA-Merge. The generated result is then explored and transformed into the merged ontology with human interaction. --- ONTEX is a method for systematically structuring the top level of ontologies. It is based on an interactive, top-down knowledge acquisition process, which assures that the knowledge engineer considers all possible cases while avoiding redundant acquisition. The method is especially suited for creating/merging the top part(s) of the ontologies, where high accuracy is required, and for supporting the merging of two (or more) ontologies on that level.
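To illustrate the lattice-building step on which FCA-Merge rests, here is a brute-force Formal Concept Analysis of a toy context. The objects and attributes are invented; in FCA-Merge itself, the context is derived from application-specific instance documents via natural language processing.

```python
# Minimal brute-force Formal Concept Analysis on a toy object/attribute
# context. Suitable only for small contexts; real FCA tools use more
# efficient lattice-construction algorithms.
from itertools import combinations

objects = {
    "hotel":    {"Accommodation", "Building"},
    "campsite": {"Accommodation", "OpenAir"},
    "museum":   {"Building", "Culture"},
}
attributes = sorted({a for attrs in objects.values() for a in attrs})

def extent(attr_set):
    """Objects that have every attribute in attr_set."""
    return frozenset(o for o, attrs in objects.items() if attr_set <= attrs)

def intent(obj_set):
    """Attributes shared by every object in obj_set."""
    if not obj_set:
        return frozenset(attributes)
    return frozenset.intersection(*(frozenset(objects[o]) for o in obj_set))

# A formal concept is a pair (A, B) with A = extent(B) and B = intent(A).
concepts = set()
for r in range(len(attributes) + 1):
    for combo in combinations(attributes, r):
        A = extent(set(combo))
        concepts.add((A, intent(A)))

for A, B in sorted(concepts, key=lambda c: (-len(c[0]), sorted(c[1]))):
    print(sorted(A), "<->", sorted(B))
```

The printed pairs, ordered by extent size, are exactly the nodes of the concept lattice that FCA-Merge hands over to the knowledge engineer for interactive exploration.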
Abstract:
Using the case of an economically declining neighbourhood in the post-industrial German Ruhr Area (sometimes characterized as Germany's "Rust Belt"), we analyse and describe how urban agriculture can be used as a catalyst to stimulate and support urban renewal and regeneration, especially from a socio-cultural perspective. Using the methodological framework of participatory action research, and linking bottom-up and top-down planning approaches, a project path was developed to include the affected population and foster individual responsibility for their district, as well as to strengthen inhabitants and stakeholder groups in a permanent collective stewardship of the individual forms of urban agriculture developed and implemented. On a more abstract level, the research carried out can be characterized as a form of action research with an intended transgression of the boundaries between research, planning, design, and implementation. We conclude that by synchronously combining those four domains with intense feedback loops, synergies can be achieved between academic knowledge on the potential of urban agriculture for sustainable development, benefits for the case-study area, and the interests of individual urban gardeners.