158 results for LARGE SYSTEMS
Abstract:
This paper addresses less-recognised factors that influence the diffusion of a particular technology. While an innovation's attributes and performance are paramount, many innovations fail because of external factors that favour an alternative. Drawing on theories of diffusion, lock-in and path dependency, this paper presents a qualitative study of the external factors that influenced the evolution of transportation in the USA. This historical account reveals how one technology and its emergent systems became dominant while other choices were overridden by socio-political, economic and technological interests, which include not just the manufacturing and service industries associated with the automobile but also government and market stakeholders. Termed here a large socio-economic regime (LSER), its power in ensuring lock-in and continued path dependency is shown to pass through three stages, eventually weakening as awareness improves. The study extends to transport trends in China, Korea, Indonesia and Malaysia, all of which show the dominant role of an LSER. As transportation policy is increasingly required to address both demand and environmental concerns, and as innovators search for solutions, this paper presents important knowledge for innovators, marketers and policy makers, for both commercial and societal reasons, especially when negative externalities associated with an incumbent transportation technology may lead to market failure.
Abstract:
Pre-service teacher education institutions are large and complex organizations, which are notoriously difficult to change. One contributing factor is that many change efforts focus largely on individual pre-service teacher educators altering their practice. We report here on our experience using a model for effecting change, which views pre-service teacher education institutions and educators as part of a much broader system. We identified numerous possibilities for, and constraints on, embedding change, but focus only on two in this paper: participants' knowledge of change strategies and their leadership capacities. As a result of our study findings and researcher reflections, we argue that being a leader in an academic area within pre-service teacher education does not equate to leadership knowledge or skills to initiate and enact systems-wide change. Furthermore, such leadership capacities must be explicitly developed if education for sustainability is to become embedded in pre-service teacher education.
Abstract:
There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit for, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high-quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed, and quantitative accuracy was improved, using heavy isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
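The <20% reproducibility figure above refers to a coefficient of variation (CV) across replicate measurements. As a small illustration of how such a summary is computed, here is a hedged Python sketch; the peak-area ratios and laboratory names are invented for illustration and this is not the study's actual analysis pipeline.

import statistics

def cv(values):
    """Coefficient of variation, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# measurements[lab] = replicate light/heavy peak-area ratios for one peptide
measurements = {
    "lab_A": [0.98, 1.02, 1.00],
    "lab_B": [1.10, 1.05, 1.08],
}

# Intra-lab: CV within each lab's replicates; inter-lab: CV of the lab means.
intra = statistics.median(cv(v) for v in measurements.values())
lab_means = [statistics.mean(v) for v in measurements.values()]
inter = cv(lab_means)
print(f"median intra-lab CV {intra:.1f}%, inter-lab CV {inter:.1f}%")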
Abstract:
This important new book draws lessons from a large-scale initiative to bring about the improvement of an urban education system. Written from an insider perspective by an internationally recognized researcher, it presents a new way of thinking about system change. This builds on the idea that there are untapped resources within schools and the communities they serve that can be mobilized in order to transform schools from places that do well for some children into places that do well for many more. Towards Self-improving School Systems presents a strategic framework that can help to foster new, more fruitful working relationships: between national and local government; within and between schools; and between schools and their local communities. What is distinctive in the approach is that it is mainly led from within schools, with senior staff having a central role as system leaders. The book will be relevant to a wide range of readers throughout the world who are concerned with the strengthening of their national educational systems, including teachers, school leaders, policy makers and researchers. The argument it presents is particularly important for the growing number of countries where increased emphasis on school autonomy, competition and choice is leading to fragmentation within education provision.
Abstract:
Lake Purrumbete maar is located in the intraplate, monogenetic Newer Volcanics Province in southeastern Australia. The extremely large crater, 3000 m in diameter, formed on an intersection of two fault lines and comprises at least three coalesced vents. The evolution of these vents was controlled by the interaction of the tectonic setting and the properties of both hard and soft rock aquifers. Lithics in the maar deposits originate from country rock formations less than 300 m deep, indicating that the large size of the crater cannot be solely the result of the downwards migration of the explosion foci in a single vent. Vertical crater walls and primary inward-dipping beds provide evidence that the original size of the crater has been largely preserved. Detailed mapping of the facies distributions, the direction of transport of base surges and pyroclastic flows, and the distribution of ballistic block fields forms the basis for the reconstruction of the complex eruption history, which is characterised by alternations of the eruption style between relatively dry and wet phreatomagmatic conditions, and by migration of the vent location along tectonic structures. Three temporally separated eruption phases are recognised, each starting at the same crater located directly at the intersection of two local fault lines. Activity then moved quickly to different locations. A significant volcanic hiatus between two of the three phases shows that the magmatic system was reactivated. The enlargement of the main crater in particular, by both lateral and vertical growth, led to the interception of the individual craters and the formation of the large circular crater. Lake Purrumbete maar is an excellent example of how complicated the evolution of large, seemingly simple, circular maar volcanoes can be, and raises the question of whether these systems are actually monogenetic.
Abstract:
Public buildings and large infrastructure are typically monitored by tens or hundreds of cameras, all capturing different physical spaces and observing different types of interactions and behaviours. To date, however, in large part due to limited data availability, crowd monitoring and operational surveillance research has focused on single-camera scenarios, which are not representative of real-world applications. In this paper we present a new, publicly available database for large-scale crowd surveillance. Footage from 12 cameras for a full work day, covering the main floor of a busy university campus building, including an internal and external foyer, elevator foyers, and the main external approach, is provided; alongside annotations for crowd counting (single or multi-camera) and pedestrian flow analysis for 10 and 6 sites respectively. We describe how this large dataset can be used to perform distributed monitoring of building utilisation, and demonstrate the potential of this dataset to understand and learn the relationship between different areas of a building.
Abstract:
Moreton Island and several other large siliceous sand dune islands and mainland barrier deposits in SE Queensland represent the distal, onshore component of an extensive Quaternary continental shelf sediment system. This sediment has been transported up to 1000 km along the coast and shelf of SE Australia over multiple glacioeustatic sea-level cycles. Stratigraphic relationships and a preliminary Optically Stimulated Luminescence (OSL) chronology for Moreton Island indicate a middle Pleistocene age for the large majority of the deposit. Dune units exposed in the centre of the island and on the east coast have OSL ages that indicate deposition occurred between approximately 540 ka and 350 ka BP, and at around 96±10 ka BP. Much of the southern half of the island has a veneer of much younger sediment, with OSL ages of 0.90±0.11 ka, 1.28±0.16 ka, 5.75±0.53 ka and <0.45 ka BP. The younger deposits were partially derived from the reworking of the upper leached zone of the much older dunes. A large parabolic dune at the northern end of the island (OSL age 9.90±1.0 ka BP) and palaeosol exposures that extend below present sea level suggest the Pleistocene dunes were sourced from shorelines positioned several to tens of metres lower than, and up to a few kilometres seaward of, the present shoreline. Given the lower gradient of the inner shelf a few km seaward of the island, it seems likely that periods of intermediate sea level (e.g. ~20 m below present) produced strongly positive onshore sediment budgets and the mobilisation of dunes inland to form much of what now comprises Moreton Island. The new OSL ages, together with the comprehensive OSL chronology for the Cooloola deposit 100 km north of Moreton Island, indicate that the bulk of the coastal dune deposits in SE Queensland were emplaced between approximately 540 ka BP and the Last Interglacial. This chronostratigraphic information improves our fundamental understanding of long-term sediment transport and accumulation in large-scale continental shelf sediment systems.
Abstract:
In the mining optimisation literature, most researchers have focused on two strategic-level and tactical-level open-pit mine optimisation problems, respectively termed the ultimate pit limit (UPIT) and constrained pit limit (CPIT) problems. However, many researchers note that the substantial numbers of variables and constraints in real-world instances (e.g., with 50-1000 thousand blocks) make the CPIT's mixed integer programming (MIP) model intractable. It is therefore a considerable challenge to solve large-scale CPIT instances without relying on an exact MIP optimiser or on complicated MIP relaxation/decomposition methods. To address this challenge, two new graph-based algorithms, built on network flow graphs and conjunctive graph theory, are developed by taking advantage of problem properties. The performance of the proposed algorithms is validated on the large-scale benchmark UPIT and CPIT instance datasets from MineLib (2013). Compared against the best known results from MineLib, the proposed algorithms outperform other CPIT solution approaches in the literature. The proposed graph-based algorithms lead to a more competent mine scheduling optimisation expert system, because a third-party MIP optimiser is no longer indispensable and random neighbourhood search is not necessary.
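The abstract does not detail the paper's algorithms, but as background the classical network-flow view of UPIT is worth illustrating: mining a block requires mining its predecessors, so the optimal pit is a maximum-weight closure of the precedence graph, recoverable from a minimum s-t cut (Picard's reduction). Below is a minimal Python sketch assuming the networkx library; the block values and precedence arcs are invented for illustration.

import networkx as nx

def ultimate_pit(block_value, precedence):
    """block_value: block -> economic value (may be negative).
    precedence: (b, a) pairs meaning block b cannot be mined unless a is."""
    G = nx.DiGraph()
    S, T = "source", "sink"
    for b, v in block_value.items():
        if v > 0:
            G.add_edge(S, b, capacity=v)    # profit forgone if b is excluded
        elif v < 0:
            G.add_edge(b, T, capacity=-v)   # cost incurred if b is mined
    for b, a in precedence:
        G.add_edge(b, a)                    # no capacity attribute = infinite
    cut_value, (source_side, _) = nx.minimum_cut(G, S, T)
    pit = source_side - {S}                 # blocks inside the optimal pit
    profit = sum(v for v in block_value.values() if v > 0) - cut_value
    return pit, profit

# Tiny example: one ore block beneath two waste blocks.
values = {"w1": -2, "w2": -2, "ore": 10}
prec = [("ore", "w1"), ("ore", "w2")]
print(ultimate_pit(values, prec))           # pit {ore, w1, w2}, profit 6

This core is solvable in polynomial time; it is the CPIT variant's period-by-period resource constraints that make the full MIP formulation so much harder.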
Abstract:
Distributed systems are widely used for solving large-scale and data-intensive computing problems, including all-to-all comparison (ATAC) problems. However, when used for ATAC problems, existing computational frameworks such as Hadoop focus on load balancing for allocating comparison tasks, without careful consideration of data distribution and storage usage. While Hadoop-based solutions provide users with simplicity of implementation, their inherent MapReduce computing pattern does not match the ATAC pattern. This leads to load imbalances and poor data locality when Hadoop's data distribution strategy is used for ATAC problems. Here we present a data distribution strategy which considers data locality, load balancing and storage savings for ATAC computing problems in homogeneous distributed systems. A simulated annealing algorithm is developed for data distribution and task scheduling. Experimental results show a significant performance improvement for our approach over Hadoop-based solutions.
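The abstract names simulated annealing as the optimiser but gives no details. The following is a minimal, generic sketch of simulated annealing over file-to-machine assignments with a pluggable cost function; the cost shown (load imbalance only) is an assumption for illustration, not the paper's actual model combining locality, balance and storage.

import math, random

def anneal(files, machines, cost, T=1.0, cooling=0.999, steps=10000):
    assign = {f: random.choice(machines) for f in files}   # random start
    cur = cost(assign)
    best, best_cost = dict(assign), cur
    for _ in range(steps):
        f = random.choice(files)
        old = assign[f]
        assign[f] = random.choice(machines)     # neighbour: move one file
        new = cost(assign)
        if new < cur or random.random() < math.exp((cur - new) / T):
            cur = new                           # accept (sometimes uphill)
            if new < best_cost:
                best, best_cost = dict(assign), new
        else:
            assign[f] = old                     # reject: undo the move
        T *= cooling                            # geometric cooling schedule
    return best, best_cost

# Illustrative cost: squared deviation of per-machine load from the mean.
machines = ["m1", "m2", "m3"]
def imbalance(assign):
    loads = {m: 0 for m in machines}
    for m in assign.values():
        loads[m] += 1
    mean = len(assign) / len(machines)
    return sum((l - mean) ** 2 for l in loads.values())

best, score = anneal([f"file{i}" for i in range(30)], machines, imbalance)
print(score)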
Abstract:
The requirement of distributed computing of all-to-all comparison (ATAC) problems in heterogeneous systems is increasingly important in various domains. Though Hadoop-based solutions are widely used, they are inefficient for the ATAC pattern, which is fundamentally different from the MapReduce pattern for which Hadoop is designed. They exhibit poor data locality and unbalanced allocation of comparison tasks, particularly in heterogeneous systems. This results in massive data movement at runtime and ineffective utilization of computing resources, significantly affecting overall computing performance. To address these problems, a scalable and efficient data and task distribution strategy is presented in this paper for processing large-scale ATAC problems in heterogeneous systems. It not only saves storage space but also achieves load balancing and good data locality for all comparison tasks. Experiments with bioinformatics examples show that about 89% of the ideal performance capacity of the multiple machines has been achieved using the approach presented in this paper.
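To make the locality and balancing goals concrete, here is a hedged Python sketch of a generic locality-aware greedy allocation of pairwise comparison tasks in a heterogeneous cluster. The greedy rule below is a common heuristic sketched under assumed inputs (a replica map and per-machine speeds), not the strategy from this paper.

from itertools import combinations

def allocate(items, replicas, capacity):
    """replicas[i]: set of machines holding a copy of item i.
    capacity[m]: relative speed of machine m. Returns task -> machine."""
    load = {m: 0.0 for m in capacity}           # estimated busy time per machine
    plan = {}
    for i, j in combinations(items, 2):         # one comparison task per pair
        def score(m):
            locality = (m in replicas[i]) + (m in replicas[j])
            return (-locality, load[m] + 1.0 / capacity[m])
        m = min(capacity, key=score)            # most local copies, earliest finish
        plan[(i, j)] = m
        load[m] += 1.0 / capacity[m]            # faster machines accrue less time
    return plan

# Tiny example: three items replicated across two machines of unequal speed.
replicas = {"a": {"m1"}, "b": {"m1", "m2"}, "c": {"m2"}}
capacity = {"m1": 2.0, "m2": 1.0}
print(allocate(["a", "b", "c"], replicas, capacity))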
Abstract:
To date, a number of two-dimensional (2D) topological insulators (TIs) have been realized in Group 14 elemental honeycomb lattices, but all are inversion-symmetric. Here, based on first-principles calculations, we predict a new family of 2D inversion-asymmetric TIs with sizeable bulk gaps from 105 meV to 284 meV in X2–GeSn (X = H, F, Cl, Br, I) monolayers, making them in principle suitable for room-temperature applications. The nontrivial topological characteristics of inverted band orders are identified in pristine X2–GeSn with X = (F, Cl, Br, I), whereas H2–GeSn undergoes a nontrivial band inversion at 8% lattice expansion. Topologically protected edge states are identified in X2–GeSn with X = (F, Cl, Br, I), as well as in strained H2–GeSn. More importantly, the edges of these systems, which exhibit single-Dirac-cone characteristics located exactly in the middle of their bulk band gaps, are ideal for dissipationless transport. Thus, Group 14 elemental honeycomb lattices provide a fascinating playground for the manipulation of quantum states.
Abstract:
The world has experienced a large increase in the amount of available data, which demands better and more specialized tools for data storage and retrieval and for information privacy. Recently, Electronic Health Record (EHR) systems have emerged to fulfill this need in health systems. They play an important role in medicine by granting access to information that can be used in medical diagnosis. Traditional systems focus on the storage and retrieval of this information, usually leaving issues related to privacy in the background. Doctors and patients may have different objectives when using an EHR system: patients try to restrict sensitive information in their medical records to prevent its misuse, while doctors want to see as much information as possible to ensure a correct diagnosis. One solution to this dilemma is the Accountable e-Health model, an access protocol model based on the Information Accountability Protocol. In this model, patients are warned when doctors access their restricted data, while authenticated doctors retain non-restrictive access. In this work we use FluxMED, an EHR system, and augment it with aspects of the Information Accountability Protocol to address these issues. The implementation of the Information Accountability Framework (IAF) in FluxMED provides ways for both patients and physicians to have their privacy and access needs met. Issues related to storage and data security are handled by FluxMED, which contains mechanisms to ensure security and data integrity. The effort required to develop a platform for the management of medical information is mitigated by FluxMED's workflow-based architecture: the system is flexible enough to allow the type and amount of information to be altered without changes to its source code.
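The core of the accountability pattern described above is that restricted reads are permitted for authenticated doctors but logged and reported to the patient, rather than blocked. Here is a minimal Python sketch of that idea; the class and field names are illustrative only and do not reflect FluxMED's actual API.

from dataclasses import dataclass
import datetime

@dataclass
class Record:
    id: str
    data: str
    restricted: bool = False

@dataclass
class Doctor:
    id: str
    authenticated: bool = False

class AccountableAccess:
    """Non-restrictive access for authenticated doctors; restricted reads
    are logged and the patient is notified instead of being blocked."""
    def __init__(self, notify_patient):
        self.audit_log = []
        self.notify_patient = notify_patient    # callback to alert the patient

    def read(self, doctor, patient_id, record):
        if not doctor.authenticated:
            raise PermissionError("unauthenticated access refused")
        if record.restricted:
            event = (datetime.datetime.now(datetime.timezone.utc),
                     doctor.id, record.id)
            self.audit_log.append(event)        # accountability trail
            self.notify_patient(patient_id, event)
        return record.data

ehr = AccountableAccess(lambda pid, e: print(f"notify patient {pid}: {e}"))
rec = Record("r1", "blood panel", restricted=True)
dr = Doctor("dr_silva", authenticated=True)
print(ehr.read(dr, "patient42", rec))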
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on collaborative modeling workshops where process stakeholders verbally contribute their perspective on a process while modeling experts translate their contributions and integrate them into a model using traditional input devices. Limiting participants to verbal contributions affects not only the outcome of collaboration but also collaboration itself. We created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. We are currently conducting a study that aims to assess the impact of CubeBPM on collaboration and modeling performance. Initial results presented in this paper indicate that the setting helped participants to become more active in collaboration.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders, together with modeling experts, create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the outcome of the collaboration and the perception of the collaboration itself. To overcome this problem we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment, thus improving modeling performance and collaboration.