146 results for landslides, riskanalysis, landslide hazard, fuzzy-logic


Relevance: 20.00%

Abstract:

Mass flows on volcanic islands generated by volcanic lava dome collapse and by larger-volume flank collapse can be highly dangerous locally and may generate tsunamis that threaten a wider area. It is therefore important to understand their frequency, emplacement dynamics, and relationship to volcanic eruption cycles. The best record of mass flow on volcanic islands may be found offshore, where most material is deposited and where intervening hemipelagic sediment aids dating. Here we analyze what is arguably the most comprehensive sediment core data set collected offshore from a volcanic island. The cores are located southeast of Montserrat, on which the Soufriere Hills volcano has been erupting since 1995. The cores provide a record of mass flow events during the last 110 thousand years. Older mass flow deposits differ significantly from those generated by the repeated lava dome collapses observed since 1995. The oldest mass flow deposit originated through collapse of the basaltic South Soufriere Hills at 103-110 ka, some 20-30 ka after eruptions formed this volcanic center. A ∼1.8 km³ blocky debris avalanche deposit that extends from a chute in the island shelf records a particularly deep-seated failure. It likely formed from a collapse of almost equal amounts of volcanic edifice and coeval carbonate shelf, emplacing a mixed bioclastic-andesitic turbidite in a complex series of stages. This study illustrates how volcanic island growth and collapse involved extensive, large-volume submarine mass flows with highly variable composition. Runout turbidites indicate that mass flows are emplaced either in multiple stages or as single events.

Relevance: 20.00%

Abstract:

Increasing global competition, rapid technological changes, advances in manufacturing and information technology, and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate the lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring performance improvements in dynamic lean supply chain situations. The need for appropriate measurement of lean supply chain performance has therefore become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of the product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics and the effects of various lean tools on the performance metrics mentioned in the SCOR framework have been investigated. A lean supply chain model based on the SCOR metric framework is then developed, in which non-lean and lean as well as quantitative and qualitative metrics are incorporated.
The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy based method is then applied to measure the performance improvements in supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products have been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics. It can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have significant effects on lean supply chain performance. Future studies could conduct multiple case studies in different contexts.
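The fuzzy TOPSIS step described above can be sketched in a few lines. The following is a minimal illustration, not the thesis's implementation: it represents each metric value as a triangular fuzzy number (l, m, u), uses the common vertex-method distance, takes crisp fuzzy ideals (1, 1, 1) and (0, 0, 0), treats every metric as a benefit criterion, and runs on made-up data.

```python
import math

def tfn_distance(a, b):
    """Vertex-method distance between two triangular fuzzy numbers (l, m, u)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def fuzzy_topsis(matrix):
    """matrix[alternative][criterion] = TFN (l, m, u); all criteria treated as benefits.
    Returns one closeness coefficient per alternative (higher = better)."""
    n_crit = len(matrix[0])
    # Normalise each benefit criterion by the largest upper bound in its column.
    norm = []
    for row in matrix:
        norm_row = []
        for j, (l, m, u) in enumerate(row):
            u_max = max(r[j][2] for r in matrix)
            norm_row.append((l / u_max, m / u_max, u / u_max))
        norm.append(norm_row)
    fpis = [(1.0, 1.0, 1.0)] * n_crit  # fuzzy positive ideal solution
    fnis = [(0.0, 0.0, 0.0)] * n_crit  # fuzzy negative ideal solution
    scores = []
    for row in norm:
        d_pos = sum(tfn_distance(v, fpis[j]) for j, v in enumerate(row))
        d_neg = sum(tfn_distance(v, fnis[j]) for j, v in enumerate(row))
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores

# Two hypothetical alternatives (e.g. lean vs non-lean) scored on two metrics.
scores = fuzzy_topsis([[(3, 5, 7), (7, 9, 9)],
                       [(1, 3, 5), (1, 3, 5)]])
```

An alternative with a higher closeness coefficient sits nearer the positive ideal and farther from the negative ideal, which is how the lean and non-lean scenarios would be ranked.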

Relevance: 20.00%

Abstract:

Multi-objective optimization has been performed for the design of a benchmark cogeneration system known as the CGAM cogeneration system. In the optimization approach, thermoeconomic and environmental aspects have been considered simultaneously. The environmental objective function has been defined and expressed in cost terms. One of the most suitable optimization techniques for this problem, the Multi-Objective Particle Swarm Optimization (MOPSO) algorithm, has been used here. This approach has been applied to find the set of Pareto optimal solutions with respect to the aforementioned objective functions. An example of fuzzy decision-making with the aid of the Bellman-Zadeh approach has been presented and a final optimal solution has been introduced.
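The core of any MOPSO-style search is the Pareto-dominance test used to keep an archive of non-dominated solutions. A minimal sketch of that test (illustrative only; the objective pairs below are made-up stand-ins for cost and environmental impact, both minimised):

```python
def dominates(a, b):
    """a dominates b (minimisation) if a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated points, i.e. the Pareto optimal set."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (cost, emissions-cost) pairs for candidate designs.
front = pareto_front([(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)])
```

A decision-making step such as the Bellman-Zadeh approach then picks one compromise solution from this front.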

Relevance: 20.00%

Abstract:

This paper makes a case for thinking about the primary school as a logic machine (apparatus) as a way of thinking about processes of in-school stratification. Firstly, we discuss related literature on in-school stratification in primary schools, particularly as it relates to literacy learning. Secondly, we explain how school reform can be thought about in terms of the idea of the machine or apparatus, in which case the processes of in-school stratification can be mapped as more than simply concerns about school organisation (such as student grouping): they also involve a politics of truth, played out in each school, that constitutes school culture and what counts as ‘good’ pedagogy. Thirdly, the paper focuses specifically on research conducted in primary schools in the northern suburbs of Adelaide, one of the most educationally disadvantaged regions in Australia, as a case study of the relationship between in-school stratification and the reproduction of inequality. We draw on more than 20 years of ethnographic work in primary schools in the northern suburbs of Adelaide and provide a snapshot of a recent attempt to improve literacy achievement in a few northern suburbs public primary schools (the SILA project). The SILA project, through diagnostic reviews, has provided a significant analysis of the challenges facing policy and practice in such challenging school contexts, an analysis that also maps onto existing (inter)national research.
These diagnostic reviews said ‘hard things’ that required attention by SILA schools, including:
· an over-reliance on whole-class, low-level, routine tasks, and hence a lack of challenge and rigour in the learning tasks offered to students;
· a focus on the 'code breaking' function of language at the expense of richer conceptualisations of literacy that might guide teachers’ understanding of challenging pedagogies;
· the need for substantial shifts in the culture of schools, especially unsettling deficit views of students and their communities;
· a need to provide a more ‘consistent’ approach to teaching literacy across the school;
· a need to focus School Improvement Plans in order to implement a clear focus on literacy learning; and
· a need to sustain professional learning to produce new knowledge and practice.
The paper concludes with suggestions for further research and possible reform projects into the primary school as a logic machine.

Relevance: 20.00%

Abstract:

Latex allergy is a serious, possibly life threatening health hazard in the perioperative environment. Policy and procedures should be developed to identify patients who may be sensitive to latex and to ensure the avoidance of latex products in their care. Healthcare workers should also take steps to avoid exposure and protect themselves from hypersensitivity reactions.

Relevance: 20.00%

Abstract:

Cryptosystems based on the hardness of lattice problems have recently acquired much importance due to their average-case to worst-case equivalence, their conjectured resistance to quantum cryptanalysis, their ease of implementation and increasing practicality, and, lately, their promising potential as a platform for constructing advanced functionalities. In this work, we construct “Fuzzy” Identity Based Encryption from the hardness of the Learning With Errors (LWE) problem. We note that for our parameters, the underlying lattice problems (such as gapSVP or SIVP) are assumed to be hard to approximate within superexponential factors for adversaries running in subexponential time. We give CPA and CCA secure variants of our construction, for small and large universes of attributes. All our constructions are secure against selective-identity attacks in the standard model. Our construction is made possible by observing certain special properties that secret sharing schemes need to satisfy in order to be useful for Fuzzy IBE. We also discuss some obstacles to realizing lattice-based attribute-based encryption (ABE).
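The threshold behaviour that Fuzzy IBE needs from its secret sharing scheme can be illustrated with plain Shamir sharing: any t of n shares (think: keys for t matching attributes) reconstruct the secret, while fewer reveal nothing. This sketch works over a toy prime field and is emphatically not the paper's lattice-based construction:

```python
import random

P = 2**31 - 1  # a Mersenne prime; toy field for the sketch

def share(secret, t, n):
    """Split secret into n shares, any t of which suffice to reconstruct.
    Shares are points on a random degree t-1 polynomial with constant term secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the field recovers the secret."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
```

In Fuzzy IBE, this t-of-n reconstruction is what lets a user whose identity matches a ciphertext identity in at least t attributes recombine the key material and decrypt.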

Relevance: 20.00%

Abstract:

At Eurocrypt’04, Freedman, Nissim and Pinkas introduced a fuzzy private matching problem. The problem is defined as follows. Given two parties, each of them having a set of vectors where each vector has T integer components, the fuzzy private matching is to securely test if each vector of one set matches any vector of another set for at least t components, where t < T. In the conclusion of their paper, they asked whether it was possible to design a fuzzy private matching protocol without incurring a communication complexity containing the binomial factor (T choose t). We answer their question in the affirmative by presenting a protocol based on homomorphic encryption, combined with the novel notion of a share-hiding error-correcting secret sharing scheme, which we show how to implement with efficient decoding using interleaved Reed-Solomon codes. This scheme may be of independent interest. Our protocol is provably secure against passive adversaries, and has better efficiency than previous protocols for certain parameter values.
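In plaintext, the matching predicate the protocol computes is simple; the cryptographic difficulty lies entirely in evaluating it under homomorphic encryption without revealing the vectors. A reference sketch of the semantics only (function names hypothetical):

```python
def fuzzy_match(v, w, t):
    """True if vectors v and w agree in at least t of their T components."""
    return sum(1 for a, b in zip(v, w) if a == b) >= t

def fuzzy_matching_plaintext(set_a, set_b, t):
    """Plaintext reference: for each vector in set_a, does it fuzzily
    match some vector in set_b? The real protocol computes this privately."""
    return [any(fuzzy_match(v, w, t) for w in set_b) for v in set_a]

# T = 4 components; a match requires agreement in at least t = 3 of them.
result = fuzzy_matching_plaintext([(1, 2, 3, 4), (5, 6, 7, 8)],
                                  [(1, 2, 3, 9), (0, 0, 0, 0)], t=3)
```

The naive private approach enumerates all (T choose t) component subsets, which is exactly the cost factor the paper's Reed-Solomon-based scheme avoids.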

Relevance: 20.00%

Abstract:

Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data are obtained from the Queensland Department of Transport and Main Roads’ STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best fitting distributions are drawn for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fit for crash, stationary vehicle, and hazard incidents, respectively. The significant impact factors are identified for crash clearance time and arrival time. The quantitative influences for crash and hazard incidents are presented for both clearance and arrival. The model accuracy is analyzed at the end.
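For the Weibull case, the hazard and survival functions have closed forms, which is what makes hazard-based duration models interpretable. A small sketch (the shape and scale values below are hypothetical, not the paper's estimates): shape > 1 gives a hazard that rises with elapsed time, i.e. the longer an incident has been under way, the more likely it is to clear soon.

```python
import math

def weibull_hazard(t, shape, scale):
    """Instantaneous clearance rate at time t: h(t) = (k/s) * (t/s)**(k-1)."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def weibull_survival(t, shape, scale):
    """Probability the incident is still unresolved at time t: S(t) = exp(-(t/s)**k)."""
    return math.exp(-((t / scale) ** shape))

# Hypothetical hazard-type incident: shape 1.5 (rising hazard), scale 30 minutes.
s10, s60 = weibull_survival(10, 1.5, 30), weibull_survival(60, 1.5, 30)
h10, h20 = weibull_hazard(10, 1.5, 30), weibull_hazard(20, 1.5, 30)
```

The Gamma and Log-logistic fits reported for the other incident types differ mainly in the shape their hazard curve can take (the Log-logistic, for instance, allows a hazard that rises then falls).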

Relevance: 20.00%

Abstract:

The applications of organic semiconductors in complex circuitry such as printed CMOS-like logic circuits demand miniaturization of the active structures to the submicrometric and nanoscale level while enhancing or at least preserving the charge transport properties upon processing. Here, we addressed this issue by using a wet lithographic technique, which exploits and enhances the molecular order in polymers by spatial confinement, to fabricate ambipolar organic field effect transistors and inverter circuits based on nanostructured single component ambipolar polymeric semiconductor. In our devices, the current flows through a precisely defined array of nanostripes made of a highly ordered diketopyrrolopyrrole-benzothiadiazole copolymer with high charge carrier mobility (1.45 cm² V⁻¹ s⁻¹ for electrons and 0.70 cm² V⁻¹ s⁻¹ for holes). Finally, we demonstrated the functionality of the ambipolar nanostripe transistors by assembling them into an inverter circuit that exhibits a gain (105) comparable to inverters based on single crystal semiconductors.

Relevance: 20.00%

Abstract:

The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements that need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.

Relevance: 20.00%

Abstract:

Braking is a crucial driving task with a direct relationship with crash risk, as both excess and inadequate braking can lead to collisions. The objective of this study was to compare the braking profile of young drivers distracted by mobile phone conversations to non-distracted braking. In particular, the braking behaviour of drivers in response to a pedestrian entering a zebra crossing was examined using the CARRS-Q Advanced Driving Simulator. Thirty-two licensed drivers drove the simulator in three phone conditions: baseline (no phone conversation), hands-free, and handheld. In addition to driving the simulator, each participant completed questionnaires related to driver demographics, driving history, usage of mobile phones while driving, and general mobile phone usage history. The drivers were 18–26 years old and split evenly by gender. A linear mixed model analysis of braking profiles along the roadway before the pedestrian crossing revealed comparatively increased decelerations among distracted drivers, particularly during the initial 20 kph of deceleration. Drivers’ initial 20 kph deceleration time was modelled using a parametric accelerated failure time (AFT) hazard-based duration model with a Weibull distribution with clustered heterogeneity to account for the repeated measures experiment design. Factors found to significantly influence the braking task included vehicle dynamics variables such as initial speed and maximum deceleration, phone condition, and driver-specific variables such as licence type, crash involvement history, and self-reported experience of using a mobile phone whilst driving. Distracted drivers on average appear to reduce the speed of their vehicle faster and more abruptly than non-distracted drivers, exhibiting comparatively excessive braking and perhaps revealing risk compensation. The braking appears to be more aggressive for distracted drivers with provisional licences compared to drivers with open licences.
Abrupt or excessive braking by distracted drivers might pose significant safety concerns to following vehicles in a traffic stream.
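In a Weibull AFT model like the one fitted here, covariates act multiplicatively on the time scale, so a negative coefficient shortens the predicted duration. A sketch with hypothetical coefficients (not the study's estimates):

```python
import math

def aft_median_time(intercept, coeffs, covariates, shape):
    """Median duration under a Weibull AFT model:
    scale = exp(intercept + x . beta); median = scale * ln(2)**(1/shape)."""
    scale = math.exp(intercept + sum(b * x for b, x in zip(coeffs, covariates)))
    return scale * math.log(2) ** (1 / shape)

# Hypothetical model of 20 kph deceleration time (seconds):
# one covariate = handheld-phone indicator, assumed coefficient -0.2.
t_handheld = aft_median_time(1.6, [-0.2], [1], shape=2.0)
t_baseline = aft_median_time(1.6, [-0.2], [0], shape=2.0)
```

With an assumed negative coefficient on the handheld indicator, the predicted median deceleration time shrinks relative to baseline, consistent with the faster, more abrupt decelerations reported for distracted drivers.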

Relevance: 20.00%

Abstract:

The purpose of this book by two Australian authors is to: introduce the audience to the full complement of contextual elements found within program theory; offer practical suggestions for engaging with theories of change, theories of action and logic models; and provide substantial evidence for this approach through scholarly literature and practice case studies, together with the authors' combined experience of 60 years.