915 results for "conservative scenario"
Abstract:
Transport processes within heterogeneous media may exhibit non-classical diffusion or dispersion that is not adequately described by the classical theory of Brownian motion and Fick's law. We consider a space-fractional advection-dispersion equation based on a fractional Fick's law. Zhang et al. [Water Resources Research, 43(5) (2007)] considered such an equation with variable coefficients, which they discretised using the finite difference method proposed by Meerschaert and Tadjeran [Journal of Computational and Applied Mathematics, 172(1):65-77 (2004)]. For this method, the presence of variable coefficients necessitates applying the product rule before discretising the Riemann–Liouville fractional derivatives using standard and shifted Grünwald formulas, depending on the fractional order. As an alternative, we propose using a finite volume method that deals directly with the equation in conservative form. Fractionally-shifted Grünwald formulas are used to discretise the Riemann–Liouville fractional derivatives at control volume faces, eliminating the need for product rule expansions. We compare the two methods for several case studies, highlighting the convenience of the finite volume approach.
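For readers unfamiliar with the discretisation, the Grünwald weights referred to above are g_k = (−1)^k (α choose k), which are conveniently generated by a short recurrence. The sketch below (function names are illustrative, not from the paper) shows the weight recurrence and a shifted Grünwald approximation of a Riemann–Liouville derivative on a uniform grid:

```python
import numpy as np

def grunwald_weights(alpha: float, n: int) -> np.ndarray:
    """Grunwald-Letnikov weights g_k = (-1)^k * C(alpha, k),
    via the recurrence g_0 = 1, g_k = g_{k-1} * (k - 1 - alpha) / k."""
    g = np.empty(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def gl_derivative(f: np.ndarray, alpha: float, h: float, shift: int = 1) -> np.ndarray:
    """Shifted Grunwald approximation of the Riemann-Liouville derivative
    on a uniform grid of spacing h; shift = 1 is the usual choice for
    1 < alpha < 2, and shift = 0 gives the standard (unshifted) formula."""
    n = len(f)
    g = grunwald_weights(alpha, n)
    d = np.zeros(n)
    for i in range(n):
        for k in range(i + shift + 1):
            j = i + shift - k          # shifted sample index
            if 0 <= j < n:
                d[i] += g[k] * f[j]
    return d / h**alpha
```

In the finite volume method advocated here, such formulas are evaluated at control volume faces rather than at grid nodes, which is what removes the need for product rule expansions of the variable coefficients.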
Abstract:
Bagasse stockpile operations have the potential to lead to adverse environmental and social impacts. Dust releases can cause occupational health and safety concerns for factory workers, and dust emissions affect the surrounding community. Preliminary modelling showed that bagasse depithing would likely reduce the environmental risks, particularly dust emissions, associated with large-scale bagasse stockpiling operations. Dust emission properties were measured and used for dispersion modelling with favourable outcomes. Modelling showed a 70% reduction in peak ground-level concentrations of PM10 dust (particles with an aerodynamic diameter less than 10 μm) from operations on depithed bagasse stockpiles compared to similar operations on stockpiles of whole bagasse. However, the costs of a depithing operation at a sugar factory were estimated to be approximately $2.1 million in capital expenditure to process 100 000 t/y of bagasse, with operating costs of $200 000 p.a. The total capital cost for a 10 000 t/y operation was approximately $1.6 million. The cost of depithing based on a discounted cash flow analysis was $5.50 per tonne of bagasse for the 100 000 t/y scenario. This may make depithing prohibitively expensive in many situations if installed exclusively as a dust control measure.
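The $5.50 per tonne figure can be sanity-checked by a levelised cost calculation, dividing the present value of all costs by the present value of throughput. The discount rate and project life below are assumptions for illustration only; the abstract does not state them:

```python
def levelised_cost_per_tonne(capex, opex_pa, tonnes_pa, rate, years):
    """Levelised cost = NPV of capital plus operating costs,
    divided by NPV of tonnes processed."""
    npv_costs = capex + sum(opex_pa / (1 + rate) ** t for t in range(1, years + 1))
    npv_tonnes = sum(tonnes_pa / (1 + rate) ** t for t in range(1, years + 1))
    return npv_costs / npv_tonnes

# Figures from the abstract; 10% over 10 years is an assumed financing
# basis, under which the result is ~ $5.4/t, close to the quoted $5.50.
print(levelised_cost_per_tonne(2.1e6, 200_000, 100_000, 0.10, 10))
```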
Abstract:
Scenario 1: A buys a two-storey commercial building built along the only street frontage to the property. Vehicles cannot reach the rear of the property as the building extends across the entire width of the land. A bought the building with full knowledge that vehicular access to the rest of the property had been compromised by a desire to obtain maximum street frontage for the building, which was occupied by a commercial tenant. On-street parking is scarce in the surrounding area. A (to the knowledge of the adjoining owner B) constructs a carpark at the rear of the building. The employees of A's tenant have been using the carpark, obtaining access via a driveway on B's land. To formalise this arrangement, A seeks a right of way for vehicles to travel down B's driveway to access the carpark...
Abstract:
In recent years, several works have investigated a formal model for Information Retrieval (IR) based on the mathematical formalism underlying quantum theory. These works have mainly exploited geometric and logical-algebraic features of the quantum formalism, for example entanglement, superposition of states, collapse onto basis states, and lattice relationships. In this poster I present an analogy between a typical IR scenario and the double slit experiment. This experiment exhibits interference phenomena between events in a quantum system, causing the Kolmogorovian law of total probability to fail. The analogy suggests routes for the application of quantum probability theory in IR. However, several questions still need to be addressed; they will be the subject of my PhD research.
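The failure of the law of total probability mentioned here can be written out explicitly. In the double slit setting, with mutually exclusive slit events B_1 and B_2, quantum probabilities arise from squared amplitudes and therefore pick up a cross term (a standard textbook formulation, not specific to this poster):

```latex
% Kolmogorovian law of total probability:
P(A) = P(A \mid B_1)\,P(B_1) + P(A \mid B_2)\,P(B_2)

% Quantum case: amplitudes add before squaring, producing a cross term:
P(A) = |\psi_1 + \psi_2|^2
     = |\psi_1|^2 + |\psi_2|^2 + 2\,|\psi_1|\,|\psi_2|\cos\theta
```

The cosine term is the interference contribution; whenever it is non-zero, the Kolmogorovian law fails.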
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insight into approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches depends strongly upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research.
These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
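Operationally, qPRP is usually described as a greedy ranking in which an interference term between a candidate document and the documents already ranked adjusts the candidate's score. The sketch below is a hedged illustration: approximating interference by negative cosine similarity is only one of the estimations the thesis investigates, and all names are illustrative:

```python
import numpy as np

def qprp_rank(p_rel, doc_vecs, depth):
    """Greedy qPRP-style ranking sketch.

    p_rel    : estimated relevance probability of each document
    doc_vecs : document vectors used to approximate interference
    depth    : number of documents to rank
    """
    norms = np.linalg.norm(doc_vecs, axis=1)
    remaining = set(range(len(p_rel)))
    ranking = []
    for _ in range(min(depth, len(p_rel))):
        def score(d):
            # Interference approximated by negative cosine similarity
            # with already-ranked documents (an assumed estimator).
            interference = sum(
                -np.dot(doc_vecs[d], doc_vecs[r]) / (norms[d] * norms[r])
                for r in ranking)
            return p_rel[d] + interference
        best = max(remaining, key=score)
        ranking.append(best)
        remaining.remove(best)
    return ranking
```

With the interference term set to zero this reduces to PRP's ranking by probability of relevance alone, which makes the relationship between the two principles easy to see.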
Creation of a new evaluation benchmark for information retrieval targeting patient information needs
Abstract:
Searching for health advice on the web is becoming increasingly common. Because of the great importance of this activity for patients and clinicians, and the effect that incorrect information may have on health outcomes, it is critical to present relevant and valuable information to a searcher. Previous evaluation campaigns on health information retrieval (IR) have provided benchmarks that have been widely used to improve health IR and record these improvements. In general, however, these benchmarks have targeted the specialised information needs of physicians and other healthcare workers. In this paper, we describe the development of a new collection for evaluating the effectiveness of IR in satisfying the health information needs of patients. Our methodology features a novel way to create statements of patients' information needs using realistic short queries associated with patient discharge summaries, which provide details of patient disorders. We adopt a scenario where the patient creates a query to seek information relating to these disorders. Thus, discharge summaries provide us with a means to create contextually driven search statements, since they may include details on the stage of the disease, family history, etc. The collection will be used for the first time as part of the ShARe/CLEF 2013 eHealth Evaluation Lab, which focuses on natural language processing and IR for clinical care.
Abstract:
This editorial depicts the current challenges in palliative care provision for patients with a haematological malignancy and the contribution of cancer nurses. There have been significant advancements in the care of patients with a haematological malignancy over the past three or more decades1. Despite this, there still exists a significant mortality risk in curative treatment, and many patients with a haematological malignancy will die from their disease1. A growing body of research indicates that patients with a haematological malignancy do not receive best practice palliative and end-of-life care2. Shortfalls in care include poor referral patterns to specialist palliative care services, lack of honest discussions regarding death and dying, inadequate spiritual care for patients and families, patients frequently dying in the acute care setting, and high levels of patient and family distress2. There have been a number of efforts in the United Kingdom, United States of America, Sweden, and Australia demonstrating that palliative and haematology care can co-exist, exemplified through clinical case studies and innovative models of care2. However, deficits in the provision of palliative care for patients with a haematological malignancy persist, as is evident in the international literature2. Addressing this issue requires research exploring new aspects of a complex scenario; here we suggest priority areas of research...
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of the processes in the Earth's core, and focus on four types of couplings that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different length scales of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional necessity of considering microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation). Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands appear out of the fluid dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales, and they are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid dynamic instability.
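The diffusion length scales invoked as a thermodynamic yardstick follow from the standard relation L ≈ √(κτ), for a diffusivity κ acting over a process time τ. The sketch below uses order-of-magnitude diffusivities and an assumed time scale, purely to illustrate why the four THMC yardsticks span such different lengths:

```python
import math

# Representative THMC diffusivities in m^2/s; order-of-magnitude
# assumptions for illustration, not values from this review.
diffusivities = {
    "thermal (Fourier)":  1e-6,
    "hydraulic (Darcy)":  1e-8,
    "mechanical (Biot)":  1e-10,
    "chemical (Fick)":    1e-12,
}

tau = 3.15e13  # assumed process time scale, roughly 1 Myr in seconds

for name, kappa in diffusivities.items():
    L = math.sqrt(kappa * tau)   # diffusion length L ~ sqrt(kappa * tau)
    print(f"{name:20s} L ~ {L:10.2f} m")
```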
Abstract:
In this chapter, we discuss four related areas of cryptology, namely, authentication, hashing, message authentication codes (MACs), and digital signatures. These topics represent active and growing research areas in cryptology. Space limitations allow us to concentrate only on the essential aspects of each topic. The bibliography is intended to supplement our survey; we have selected items which provide an overview of the current state of knowledge in the above areas. Authentication deals with the problem of providing assurance to a receiver that a communicated message originates from a particular transmitter, and that the received message has the same content as the transmitted message. A typical authentication scenario occurs in computer networks, where the identity of two communicating entities is established by means of authentication. Hashing is concerned with the problem of providing a relatively short digest, or fingerprint, of a much longer message or electronic document. A hashing function must satisfy (at least) the critical requirement that the fingerprints of two distinct messages are distinct. Hashing functions have numerous applications in cryptology. They are often used as primitives to construct other cryptographic functions. MACs are symmetric key primitives that provide message integrity against active spoofing by appending a cryptographic checksum to a message that is verifiable only by the intended recipient of the message. Message authentication is one of the most important ways of ensuring the integrity of information that is transferred by electronic means. Digital signatures provide electronic equivalents of handwritten signatures. They preserve the essential features of handwritten signatures and can be used to sign electronic documents. Digital signatures can potentially be used in legal contexts.
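As a concrete illustration of the MAC mechanism described above, the following sketch uses Python's standard hmac module to append and verify a keyed checksum; the key and message are placeholders:

```python
import hashlib
import hmac

key = b"shared secret key"            # known only to sender and receiver
message = b"transfer 100 to alice"

# Sender: compute the cryptographic checksum (MAC tag) for the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

# Receiver: recompute the tag and compare in constant time; an actively
# spoofed message or tag fails this check.
expected = hmac.new(key, message, hashlib.sha256).digest()
assert hmac.compare_digest(tag, expected)
```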
Abstract:
The 21st century will see monumental change. Either the human race will use its knowledge and skills to change the way it interacts with the environment, or the environment will change the way it interacts with its inhabitants. In the first case, the focus of this book, we would see our sophisticated understanding in areas such as physics, chemistry, engineering, biology, planning, commerce, business and governance, accumulated over the last 1,000 years, brought to bear on the challenge of dramatically reducing our pressure on the environment. The second case is the opposite scenario, involving the decline of the planet's ecosystems until they reach thresholds where recovery is not possible, and beyond which we have no idea what happens. For instance, if we fail to respond to Sir Nicholas Stern's call to meet appropriate stabilisation trajectories for greenhouse gas emissions, and we allow the average temperature of our planet's surface to increase by 4-6 degrees Celsius, we will see staggering changes to our environment, including rapidly rising sea levels, withering crops, diminishing water reserves, drought, cyclones, floods… Allowing this to happen would be the failure of our species, and those that survive will inherit a deadly legacy. In this update to the 1997 international best seller Factor Four, Ernst von Weizsäcker again leads a team to present a compelling case for sector-wide advances that can deliver significant resource productivity improvements over the coming century. The purpose of this book is to inspire hope and then inform meaningful action in the coming decades to respond to the greatest challenge our species has ever faced: that of living in harmony with our planet and its other inhabitants.
Abstract:
Secure multi-party computation (MPC) protocols enable a set of n mutually distrusting participants P_1, ..., P_n, each with their own private input x_i, to compute a function Y = F(x_1, ..., x_n), such that at the end of the protocol all participants learn the correct value of Y, while secrecy of the private inputs is maintained. Classical results in unconditionally secure MPC indicate that, in the presence of an active adversary, every function can be computed if and only if the number of corrupted participants, t_a, is smaller than n/3. Relaxing the requirement of perfect secrecy and utilising broadcast channels, one can improve this bound to t_a < n/2. All existing MPC protocols assume that uncorrupted participants are truly honest, i.e., they are not even curious to learn other participants' secret inputs. Based on this assumption, some MPC protocols are designed in such a way that, after elimination of all misbehaving participants, the remaining ones learn all information in the system. This is not consistent with maintaining privacy of the participants' inputs. Furthermore, an improvement of the classical results given by Fitzi, Hirt, and Maurer indicates that in addition to t_a actively corrupted participants, the adversary may simultaneously corrupt some participants passively. This is in contrast to the assumption that participants who are not corrupted by an active adversary are truly honest. This paper examines the privacy of MPC protocols and introduces the notion of an omnipresent adversary, which cannot be eliminated from the protocol. The omnipresent adversary can be passive, active, or mixed. We assume that up to a minority of participants who are not corrupted by an active adversary can be corrupted passively, with the restriction that at any time the number of corrupted participants does not exceed a predetermined threshold. We also show that the existence of a t-resilient protocol for a group of n participants implies the existence of a t'-private protocol for a group of n' participants. That is, the elimination of misbehaving participants from a t-resilient protocol leads to the decomposition of the protocol. Our adversary model stipulates that an MPC protocol never operates with a set of truly honest participants (which is a more realistic scenario). Therefore, privacy of all participants who properly follow the protocol will be maintained. We present a novel disqualification protocol to avoid a loss of privacy of participants who properly follow the protocol.
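Unconditionally secure MPC protocols of this kind are typically built from threshold secret sharing, in which any t+1 shares reconstruct a secret while t or fewer reveal nothing. A minimal Shamir sharing sketch over a prime field follows (the modulus and parameters are illustrative, and this is not the paper's disqualification protocol):

```python
import random

P = 2**61 - 1  # prime modulus; the field size is an arbitrary choice here

def share(secret, n, t):
    """Split `secret` into n shares so that any t+1 reconstruct it:
    evaluate a random degree-t polynomial with constant term `secret`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(123456789, n=5, t=2)       # tolerates t = 2 curious parties
assert reconstruct(shares[:3]) == 123456789
```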
Abstract:
Convective downburst wind storms generate the peak annual gust wind speed for many parts of the non-cyclonic world at return periods of importance for ultimate limit state design. Despite this, there is little clear understanding of how to appropriately design for these wind events, given their significant dissimilarities to the boundary layer winds upon which most design is based. To enhance the understanding of wind fields associated with these storms, a three-dimensional numerical model was developed to simulate a multitude of idealised downburst scenarios and to investigate their near-ground wind characteristics. Stationary and translating downdraft wind events in still and sheared environments were simulated, with baseline results showing good agreement with previous numerical work and full-scale observational data. Significant differences are shown in the normalised peak velocity profiles depending on the environmental wind conditions in the vicinity of the simulated event. When integrated over the height of mid- to high-rise structures, all simulated profiles are shown to produce wind loads smaller than an equivalent open-terrain boundary layer profile matched at a height of 10 m. This suggests that for these structures the current design approach is conservative from an ultimate loading standpoint. Investigating the influence of topography on the structure of the simulated near-ground downburst wind fields, it is shown that these features amplify wind speeds in a manner similar to that expected for boundary layer winds, but the extent of amplification is reduced. The level of reduction is shown to be dependent on the depth of the simulated downburst outflow.
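The load comparison described above amounts to integrating velocity pressure over the structure height for each profile. The sketch below illustrates the mechanics using a power-law boundary layer profile and a crude nose-shaped downburst profile, both matched at 10 m; the shapes, exponent and peak height are assumptions for illustration, not the paper's simulated profiles:

```python
import numpy as np

RHO = 1.2    # air density, kg/m^3
V10 = 40.0   # both profiles share the same 10 m reference gust speed

def boundary_layer(z, alpha=0.14):
    """Open-terrain power-law profile (assumed exponent)."""
    return V10 * (z / 10.0) ** alpha

def downburst(z, z_peak=20.0):
    """Nose-shaped outflow profile (assumed shape and peak height),
    rescaled so that V(10 m) = V10, i.e. height-matched at 10 m."""
    f = lambda s: (s / z_peak) * np.exp(1.0 - s / z_peak)
    return V10 * f(z) / f(10.0)

def integrated_load(profile, height, n=2000):
    """Integrate velocity pressure 0.5*rho*V(z)^2 over the height;
    width and drag coefficient are omitted as they cancel in the ratio."""
    z = np.linspace(0.1, height, n)
    q = 0.5 * RHO * profile(z) ** 2
    return np.sum(q) * (z[1] - z[0])

# Downburst-to-boundary-layer load ratio for a 100 m tall structure.
# With these assumed shapes the ratio falls below one, qualitatively
# matching the finding reported for mid- to high-rise heights.
print(integrated_load(downburst, 100.0) / integrated_load(boundary_layer, 100.0))
```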
Abstract:
LiteSteel beam (LSB) is a new cold-formed steel hollow flange channel section produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. LSBs are commonly used as floor joists and bearers with web openings in residential, industrial and commercial buildings. Due to the unique geometry of LSBs, as well as the unique residual stress characteristics and initial geometric imperfections resulting from the manufacturing process, much of the existing research for common cold-formed steel sections is not directly applicable to LSBs. Many research studies have been carried out to evaluate the behaviour and design of LSBs subject to pure bending actions, predominant shear and combined actions. However, to date, no investigation has been conducted into the web crippling behaviour and strength of LSB sections. Hence detailed experimental studies were conducted to investigate the web crippling behaviour and strength of LSBs under EOF (End One Flange) and IOF (Interior One Flange) load cases. A total of 26 web crippling tests were conducted and the results were compared with current AS/NZS 4600 design rules. This comparison showed that the AS/NZS 4600 (SA, 2005) design rules are very conservative for LSB sections under EOF and IOF load cases. Suitable design equations have been proposed to determine the web crippling capacity of LSBs based on the experimental results. This paper presents the details of this experimental study on the web crippling behaviour and strength of LiteSteel beams under EOF and IOF load cases.
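For context, AS/NZS 4600 expresses web crippling capacity through a unified equation of the form R_b = C t² f_y sinθ (1 − C_r√(r_i/t))(1 + C_l√(l_b/t))(1 − C_w√(d_1/t)), with coefficients tabulated by section type and load case. The sketch below implements that form; the coefficient values and dimensions are placeholders, not the LSB-specific values proposed in this paper:

```python
import math

def web_crippling_capacity(t, fy, theta_deg, ri, lb, d1, C, Cr, Cl, Cw):
    """Unified web crippling form used by AS/NZS 4600:
    Rb = C * t^2 * fy * sin(theta) * (1 - Cr*sqrt(ri/t))
         * (1 + Cl*sqrt(lb/t)) * (1 - Cw*sqrt(d1/t)),
    with t, ri, lb, d1 in mm and fy in MPa; result in N.
    C, Cr, Cl, Cw depend on section type and load case (EOF, IOF, ...)."""
    theta = math.radians(theta_deg)
    return (C * t**2 * fy * math.sin(theta)
            * (1 - Cr * math.sqrt(ri / t))
            * (1 + Cl * math.sqrt(lb / t))
            * (1 - Cw * math.sqrt(d1 / t)))

# Hypothetical LSB-like inputs with placeholder coefficients.
Rb = web_crippling_capacity(t=2.0, fy=450.0, theta_deg=90.0,
                            ri=3.0, lb=50.0, d1=120.0,
                            C=4.0, Cr=0.14, Cl=0.35, Cw=0.02)
print(f"Rb ~ {Rb / 1e3:.1f} kN")
```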
Abstract:
Cold-formed steel members are increasingly used as primary structural elements in buildings due to the availability of thin, high strength steels and advanced cold-forming technologies. Cold-formed lipped channel beams (LCBs) are commonly used as flexural members such as floor joists and bearers. Many research studies have been carried out to evaluate the behaviour and design of LCBs subject to pure bending actions. However, limited research has been undertaken on the shear behaviour and strength of LCBs. Hence a numerical study was undertaken to investigate the shear behaviour and strength of LCBs. Finite element models of simply supported LCBs with aspect ratios of 1.0 and 1.5 under a mid-span load were developed, validated by comparison with test results, and then used in a detailed parametric study. These numerical studies investigated the shear buckling and post-buckling behaviour of LCBs. Experimental and numerical results showed that the current design rules in cold-formed steel design codes are very conservative for the shear design of LCBs. Improved design equations were therefore proposed for the shear strength of LCBs. This paper presents the details of this numerical study of LCBs and its results.
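The AS/NZS 4600 shear rules that this study found conservative take a three-regime form: web yielding, inelastic buckling, and elastic buckling. A sketch of the check follows (dimensions and defaults are illustrative):

```python
import math

def shear_capacity_asnzs4600(d1, tw, fy, E=200e3, kv=5.34):
    """Nominal shear capacity Vv (N) of a cold-formed web to AS/NZS 4600.
    d1: flat web depth (mm), tw: web thickness (mm), fy: yield stress
    (MPa), kv: shear buckling coefficient (5.34 for a long unstiffened
    web panel)."""
    slenderness = d1 / tw
    limit = math.sqrt(E * kv / fy)
    if slenderness <= limit:
        return 0.64 * fy * d1 * tw                    # shear yielding
    elif slenderness <= 1.415 * limit:
        return 0.64 * tw**2 * math.sqrt(kv * fy * E)  # inelastic buckling
    else:
        return 0.905 * E * kv * tw**3 / d1            # elastic buckling

# Hypothetical LCB-like web:
print(shear_capacity_asnzs4600(d1=150.0, tw=1.9, fy=450.0) / 1e3, "kN")
```

The conservatism reported above arises largely because the elastic buckling branch takes no account of the post-buckling reserve observed in the LCB tests and analyses.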
Abstract:
LiteSteel beam (LSB) is a cold-formed steel hollow flange channel section produced using a patented manufacturing process involving simultaneous cold-forming and dual electric resistance welding. It is commonly used as floor joists and bearers in residential, industrial and commercial buildings. Design of the LSB is governed by the Australian cold-formed steel structures code, AS/NZS 4600. Due to the geometry of the LSB, as well as its unique residual stress characteristics and initial geometric imperfections resulting from the manufacturing process, currently available design equations for common cold-formed sections are not directly applicable to the LSB. Many research studies have been carried out to evaluate the behaviour and design of LSBs subject to pure bending actions and predominant shear actions. To date, however, no investigation has been conducted into the strength of LSB sections under combined bending and shear actions. Hence experimental and numerical studies were conducted to assess the combined bending and shear behaviour of LSBs. Finite element models were developed to simulate the combined bending and shear behaviour and strength of LSBs. They were then validated by comparison with available experimental test results and used in a detailed parametric study. The results from the experimental and finite element analyses were compared with current AS/NZS 4600 and AS 4100 design rules. Both the experimental and numerical studies show that the AS/NZS 4600 design rule based on a circular interaction equation is conservative in predicting the combined bending and shear capacities of LSBs. This paper presents the details of the numerical studies of LSBs and their results. In response to the inadequacies of current approaches to designing LSBs for combined bending and shear, two lower bound design equations are proposed in this paper.
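The circular interaction rule referred to here checks combined actions against (M*/M_s)² + (V*/V_v)² ≤ 1. A minimal sketch of that check follows; the paper's proposed lower bound equations are not reproduced here:

```python
def circular_interaction_ok(m_star, v_star, ms, vv):
    """AS/NZS 4600-style circular interaction check for combined
    bending and shear: (M*/Ms)^2 + (V*/Vv)^2 <= 1.0."""
    return (m_star / ms) ** 2 + (v_star / vv) ** 2 <= 1.0

# Hypothetical design actions and capacities (kNm, kN):
print(circular_interaction_ok(m_star=18.0, v_star=30.0, ms=25.0, vv=45.0))
```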