939 results for “Multi-source case studies”


Relevance: 100.00%

Abstract:

Vibration-based damage identification methods examine changes in primary modal parameters, or in quantities derived from them. As one method may have advantages over another under certain circumstances, a multi-criteria approach is proposed. Case studies are conducted separately on beam, plate and plate-on-beam structures. Using numerically simulated modal data obtained through finite element analysis software, indices based on changes in flexibility and strain energy before and after damage are computed and used to assess the state of structural health. Results show that the proposed multi-criteria method is effective for damage identification in these structures.
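A flexibility-based index of this kind is typically assembled from mass-normalized mode shapes and natural frequencies of the healthy and damaged states. As a rough illustration only (the function names and data layout below are assumptions, not taken from the paper), a minimal sketch might look like:

```python
import numpy as np

def modal_flexibility(phis, omegas):
    """Modal flexibility F = sum_i (phi_i phi_i^T) / omega_i^2, built from
    mass-normalized mode shapes (columns of `phis`) and natural frequencies."""
    F = np.zeros((phis.shape[0], phis.shape[0]))
    for phi, w in zip(phis.T, omegas):
        F += np.outer(phi, phi) / w**2
    return F

def flexibility_damage_index(phis_h, om_h, phis_d, om_d):
    """Per-DOF index: largest absolute change in each column of the
    flexibility matrix between healthy (h) and damaged (d) states."""
    dF = modal_flexibility(phis_d, om_d) - modal_flexibility(phis_h, om_h)
    return np.max(np.abs(dF), axis=0)
```

Degrees of freedom where the index peaks are candidate damage locations; the multi-criteria approach would combine this with an analogous strain-energy index.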

Relevance: 100.00%

Abstract:

This report presents lessons, case studies, guidelines and resources for non-government organisations that are planning to implement shared or collaborative arrangements with other agencies. It summarises results from an evaluation of the implementation phase of the Multi-Tenant Service Centre (MTSC) Pilots Project, completed in June 2008. The evaluation shows that developing and implementing shared and collaborative arrangements is a complex process that presents many risks, challenges and barriers to success, but can offer many potential benefits for non-government organisations. As this report makes clear, there is no ‘one size fits all’ approach to the process. The MTSC Pilots Project was conducted by the Department of Communities (DoC), Queensland Government, as part of its Strengthening Non-Government Organisations strategy. The objective of the MTSC Pilots initiative was to co-locate separate service providers in an appropriately located centre, operating with effective and transparent management, which enabled service providers to improve client services. Three MTSC consortia, in Mackay, Caboolture and Toowoomba, were selected as the pilots over a four-year period from 2006 to 2010.

Relevance: 100.00%

Abstract:

A shortage of medical personnel has become a critical problem for developing countries attempting to expand the provision of medical services for the poor. In order to highlight the driving forces determining the international allocation of medical personnel, the cases of four countries, namely the Philippines and South Africa as source countries and Saudi Arabia and the United Kingdom as destination countries, are examined. The paper concludes that changes in demand generated in major destination countries determine the international allocation of medical personnel at least in the short run. Major destination countries often alter their policies on how many medical staff they can accept, and from where, while source countries are required to make appropriate responses to the changes in demand.

Relevance: 100.00%

Abstract:

Based on the empirical evidence that the ratio of email messages in public mailing lists to versioning-system commits has remained relatively constant throughout the history of the Apache Software Foundation (ASF), this paper studies what can be inferred from such a metric for ASF projects. We have found that it appears to be an intensive metric: it is independent of the size of the project, its activity, and the number of developers, and remains relatively independent of the technology or functional area of the project. Our analysis provides evidence that the metric is related to the technical effervescence and popularity of a project, and as such is a good candidate for measuring its healthy evolution. Other similar metrics, such as the ratio of developer messages to commits and the ratio of issue-tracker messages to commits, are also studied for several projects in order to see whether they share these characteristics.
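The “relatively constant” observation can be checked for any project that exposes per-period message and commit counts by looking at the spread of the per-period ratio. A small sketch (the function names and data are illustrative, not from the paper):

```python
def ratio_series(msg_counts, commit_counts):
    """Per-period messages-to-commits ratios, skipping periods
    with no commit activity."""
    return [m / c for m, c in zip(msg_counts, commit_counts) if c > 0]

def coefficient_of_variation(ratios):
    """Relative spread of the ratio series; values near zero support
    the 'relatively constant' observation."""
    mean = sum(ratios) / len(ratios)
    var = sum((r - mean) ** 2 for r in ratios) / len(ratios)
    return var ** 0.5 / mean
```

A low coefficient of variation across periods, independent of project size, is what would characterise the ratio as an intensive metric.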

Relevance: 100.00%

Abstract:

Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization-noise modelling and word-length optimization methodologies. In this Ph.D. thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on interval extensions have produced accurate models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art statistical Modified Affine Arithmetic (MAA) methodology in order to model systems that contain control-flow structures. Our methodology generates the different execution paths automatically, determines the regions of the input domain that exercise each of them, and extracts the statistical moments of the system from these partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with such control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% from simulation-based reference values. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group independently, and then combines the partial results. In this way the number of active noise sources is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, in order to keep the results as accurate as possible. This thesis also covers the development of word-length optimization methodologies based on Monte Carlo simulations that run in reasonable times. We present two novel techniques that approach the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization. Second, the incremental method builds on the fact that, although a given confidence level must be guaranteed for the final results of the search, more relaxed confidence levels, and hence considerably fewer simulation samples, can be used in the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be reduced by factors of up to ×240 for small and medium-sized problems. Finally, this work introduces HOPLITE, an automated, flexible and modular quantization framework that implements the previous techniques and is publicly available. Its aim is to offer developers and researchers a common ground for easily prototyping and verifying new methodologies for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API, and give a step-by-step demonstration of its operation. We also show, through a simple example, how new extensions can be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
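The saving exploited by an incremental Monte Carlo strategy comes from the standard normal-approximation link between confidence level and sample count. A back-of-the-envelope sketch (not the thesis's actual implementation; the function name and the coefficient-of-variation assumption are illustrative):

```python
import math
from statistics import NormalDist

def samples_needed(confidence, rel_error, cv=1.0):
    """Normal-approximation sample size to estimate a mean to within
    `rel_error` relative error at a two-sided `confidence` level;
    `cv` is the assumed coefficient of variation of the samples."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
    return math.ceil((z * cv / rel_error) ** 2)
```

Relaxing 99% confidence to 80% in the early stages of the search cuts the per-evaluation sample budget by roughly a factor of four, which is where the speed-up of the early iterations comes from.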

Relevance: 100.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

The occurrence of rockbursts was quite common during active mining periods in the Champion reef mines of the Kolar gold fields, India. Among the major rockbursts, the ‘area-rockbursts’ were unique both in their spatio-temporal distribution and in the extent of damage caused to the mine workings. A detailed study of the spatial clustering of three major area-rockbursts (ARB) was carried out using a multi-fractal technique involving generalized correlation integral functions. The spatial distribution analysis of all three area-rockbursts showed that they are heterogeneous. The degree of heterogeneity (D2 − D∞) for ARB-I, II and III was found to be 0.52, 0.37 and 0.41 respectively. These differences in fractal structure indicate that the ARBs of the present study were controlled by different heterogeneous stress fields associated with different mining and geological conditions. The study clearly shows the advantages of applying multi-fractal analysis to seismic data in order to characterise, analyse and examine area-rockbursts and their causative factors in the Kolar gold mines.
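For q = 2 the generalized correlation integral reduces to the classical correlation sum C(r), whose log–log slope against r estimates the correlation dimension D2, one end of the heterogeneity measure quoted above. A minimal sketch on synthetic point sets (not the mine catalogue; names are illustrative):

```python
import numpy as np

def correlation_dimension(points, radii):
    """Estimate D2 as the slope of log C(r) vs log r, where C(r) is the
    fraction of event pairs separated by less than r."""
    n = len(points)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = dists[np.triu_indices(n, k=1)]        # unique pairs only
    C = np.array([(pairs < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope
```

A homogeneous point set gives D2 close to the embedding dimension; clustered (heterogeneous) event locations pull D2 down, and the spread of Dq over q gives the D2 − D∞ measure used in the study.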

Relevance: 100.00%

Abstract:

A collection of case studies of individuals and organisations utilising open models in the Asia Pacific and associated regions. The case studies represent activities in nine countries, broader regions such as the Arab nations, and global efforts towards sustainability and social justice, revealing creative ways of participating in the commons. Featured are remix artists, performers, open source software programmers, film makers, collecting institutions and publishing houses focused on democracy and change, who demonstrate a diverse set of motivations to engage with the shared ideals of openness and community collaboration.

Relevance: 100.00%

Abstract:

International comparison is complicated by the use of different terms, classification methods, policy frameworks and system structures, not to mention different languages and terminology. Multi-case studies can assist in the understanding of the influence wielded by cultural, social, economic, historical and political forces upon educational decisions, policy construction and changes over time. But case studies alone are not enough. In this paper, we argue for an ecological or scaled approach that travels through macro, meso and micro levels to build nested case studies, allowing for more comprehensive analysis of the external and internal factors that shape policy-making and education systems. Such an approach allows for deeper understanding of the relationship between globalizing trends and policy developments.

Relevance: 100.00%

Abstract:

Existing research and best practice were utilised to develop the Project Management, Stakeholder Engagement and Change Facilitation (PSC) approach to road safety infrastructure projects. Two case studies involving Queensland Transport and Main Roads demonstrated that use of the PSC approach has the potential to create synergies for projects undertaken by multi-disciplinary road safety groups, and to complement Safe System projects and philosophy. The case studies, the North West Road Safety Alliance project and the implementation of Road Safety Audit policy, utilised a mix of qualitative and quantitative methods including interviews and surveys.

Relevance: 100.00%

Abstract:

Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate-extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as well as P sorption and desorption characteristics. Surface bicarbonate-extractable P (PB, 0–0.05 m depth) and dilute CaCl2-extractable molybdate-reactive P (PC) have been significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m, detected as PB, was observed at P2. Elevated concentrations of CaCl2-extractable organic P forms (POC; estimated by non-molybdate-reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both of these sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface soil total P (0–0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1–5 mg P/L for P1 and 50–80 mg P/L for P2). Precipitation of sparingly soluble P phases was evidenced in the soils of the P2 effluent application area.

Relevance: 100.00%

Abstract:

The conventional Cornell source-based approach to probabilistic seismic-hazard assessment (PSHA) has been employed all around the world, and many studies rely on computer packages such as FRISK (McGuire, FRISK: a computer program for seismic risk analysis. Open-File Report 78-1007, United States Geological Survey, Department of the Interior, Washington, 1978) and SEISRISK III (Bender and Perkins, SEISRISK III: a computer program for seismic hazard estimation. Bulletin 1772, United States Geological Survey, Department of the Interior, Washington, 1987). A ‘black-box’ syndrome may result if the user of the software does not have another simple and robust PSHA method with which to make comparisons. An alternative method for PSHA, the direct amplitude-based (DAB) approach, has been developed as a heuristic and efficient method that enables users to undertake their own sanity checks on the outputs of computer packages. This paper applies the DAB approach to three cities, in China, Iran and India respectively, and compares the results with documented results computed by the source-based approach. Several insights regarding the procedure for conducting PSHA have also been obtained, which could be useful for future seismic-hazard studies.
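The kind of sanity check such an independent method enables rests on the standard PSHA bookkeeping: sum per-source exceedance rates, then convert the annual rate into a probability of exceedance over the exposure time via the usual Poisson assumption. A generic sketch (illustrative names and figures, not the paper's code or data):

```python
import math

def annual_exceedance_rate(sources):
    """Total hazard rate: sum over sources of (activity rate per year,
    probability that an event from that source exceeds the target
    ground-motion level)."""
    return sum(nu * p_exceed for nu, p_exceed in sources)

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one exceedance of the target
    level during the exposure time."""
    return 1.0 - math.exp(-annual_rate * years)
```

For example, the common design criterion of 10% probability of exceedance in 50 years corresponds to an annual rate of −ln(0.9)/50 ≈ 0.0021, i.e. a return period of about 475 years.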

Relevance: 100.00%

Abstract:

As a potential poverty reduction and climate change strategy, this paper considers the advantages and disadvantages of using renewable energy technologies for rural electrification in developing countries. Although each case must be considered independently, given a reliable fuel source, renewable energy mini-grids powered by biomass gasifiers or micro-hydro plants appear to be the favoured option due to their lower levelised costs, provision of AC power, potential to provide a 24-hour service, and ability to host larger-capacity systems that can power a wide range of electricity uses. Sustainability indicators are applied to three case studies in order to explore the extent to which renewable energy mini-grids can create sustainable welfare benefits. Policy work should focus on raising awareness of renewable energy mini-grids, improving institutional, technical and regulatory frameworks, and developing innovative financing mechanisms to encourage private-sector investment. Establishing joint technology and community engagement training centres should also be encouraged.
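The levelised-cost comparison behind the mini-grid recommendation follows the standard LCOE definition: discounted lifetime costs divided by discounted lifetime energy output. A minimal sketch with illustrative parameters (not the paper's figures):

```python
def levelised_cost(capex, annual_opex, annual_kwh, discount_rate, lifetime):
    """LCOE in cost units per kWh: up-front capital plus discounted
    operating costs, divided by discounted energy delivered over the
    system lifetime (years)."""
    disc = [(1 + discount_rate) ** -t for t in range(1, lifetime + 1)]
    costs = capex + sum(annual_opex * d for d in disc)
    energy = sum(annual_kwh * d for d in disc)
    return costs / energy
```

Because mini-grid technologies differ mainly in capital intensity and output, comparing their LCOEs under a common discount rate and lifetime is what makes the "lower levelised costs" claim testable case by case.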

Relevance: 100.00%

Abstract:

Changes to software requirements not only pose a risk to the successful delivery of software applications but also provide opportunity for improved usability and value. Increased understanding of the causes and consequences of change can support requirements management and also make progress towards the goal of change anticipation. This paper presents the results of two case studies that address objectives arising from that ultimate goal. The first case study evaluated the potential of a change source taxonomy containing the elements ‘market’, ‘organisation’, ‘vision’, ‘specification’, and ‘solution’ to provide a meaningful basis for change classification and measurement. The second case study investigated whether the requirements attributes of novelty, complexity, and dependency correlated with requirements volatility. While insufficiency of data in the first case study precluded an investigation of changes arising due to the change source of ‘market’, for the remainder of the change sources, results indicate a significant difference in cost, value to the customer and management considerations. Findings show that higher cost and value changes arose more often from ‘organisation’ and ‘vision’ sources; these changes also generally involved the co-operation of more stakeholder groups and were considered to be less controllable than changes arising from the ‘specification’ or ‘solution’ sources. Results from the second case study indicate that only ‘requirements dependency’ is consistently correlated with volatility and that changes coming from each change source affect different groups of requirements. We conclude that the taxonomy can provide a meaningful means of change classification, but that a single requirement attribute is insufficient for change prediction. A theoretical causal account of requirements change is drawn from the implications of the combined results of the two case studies.
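The volatility analysis in the second case study amounts to correlating a per-requirement attribute score (novelty, complexity, or dependency) with its observed change count. A small sketch of that computation (Pearson's coefficient here; the paper does not state which coefficient was used, and the inputs are illustrative):

```python
def pearson(xs, ys):
    """Pearson correlation between an attribute score per requirement
    (e.g. dependency count) and its change count (volatility)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)
```

A finding that only dependency correlates consistently with volatility would show up here as a coefficient near zero for the novelty and complexity columns and a clearly positive one for dependency.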