643 results for Mathematical Techniques - Integration
Abstract:
The continuous growth of XML data poses a great concern in the area of XML data management. The need for processing large amounts of XML data brings complications to many applications, such as information retrieval, data integration and many others. One way of simplifying this problem is to break the massive amount of data into smaller groups by application of clustering techniques. However, XML clustering is an intricate task that may involve the processing of both the structure and the content of XML data in order to identify similar XML data. This research presents four clustering methods, two utilizing the structure of XML documents and the other two utilizing both the structure and the content. The two structural clustering methods have different data models: one is based on a path model and the other on a tree model. These methods employ rigid similarity measures which aim to identify corresponding elements between documents with different or similar underlying structure. The two clustering methods that utilize both the structural and content information vary in terms of how the structure and content similarity are combined. One clustering method calculates the document similarity by using a linear weighting combination strategy of structure and content similarities. The content similarity in this clustering method is based on a semantic kernel. The other method calculates the distance between documents by a non-linear combination of the structure and content of XML documents using a semantic kernel. Empirical analysis shows that the structure-only clustering method based on the tree model is more scalable than the structure-only clustering method based on the path model, as the tree similarity measure for the tree model does not need to visit the parents of an element many times. Experimental results also show that the clustering methods perform better with the inclusion of the content information on most test document collections. To further the research, the structural clustering method based on the tree model is extended and employed in XML transformation. The results from the experiments show that the proposed transformation process is faster than the traditional transformation system that translates and converts the source XML documents sequentially. Also, the schema matching process of XML transformation produces a better matching result in a shorter time.
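To make the linear weighting combination strategy concrete, here is a minimal sketch assuming a tuning parameter alpha and placeholder similarity functions; the function names, the Jaccard stand-ins and the example documents are illustrative assumptions, not taken from the thesis.

```python
def combined_similarity(doc_a, doc_b, structure_sim, content_sim, alpha=0.5):
    """Linearly combine structural and content similarity.

    alpha weights the structural component; (1 - alpha) weights content.
    Both similarity functions are assumed to return values in [0, 1].
    """
    return alpha * structure_sim(doc_a, doc_b) + (1 - alpha) * content_sim(doc_a, doc_b)


# Illustrative stand-ins: Jaccard overlap of root-to-leaf paths and of terms.
def path_jaccard(a, b):
    pa, pb = set(a["paths"]), set(b["paths"])
    return len(pa & pb) / len(pa | pb) if pa | pb else 0.0

def term_jaccard(a, b):
    ta, tb = set(a["terms"]), set(b["terms"])
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

doc1 = {"paths": {"/article/title", "/article/body/sec"}, "terms": {"xml", "clustering"}}
doc2 = {"paths": {"/article/title", "/article/abstract"}, "terms": {"xml", "kernel"}}
print(combined_similarity(doc1, doc2, path_jaccard, term_jaccard, alpha=0.6))
```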
Abstract:
Because of increased competition between healthcare providers, higher customer expectations, stringent checks on insurance payments and new government regulations, it has become vital for healthcare organisations to enhance the quality of the care they provide, to increase efficiency, and to improve the cost effectiveness of their services. Consequently, a number of quality management concepts and tools are employed in the healthcare domain to achieve the most efficient ways of using time, manpower, space and other resources. Emergency departments are designed to provide a high-quality medical service with immediate availability of resources to those in need of emergency care. The challenge of maintaining a smooth flow of patients in emergency departments is a global problem. This study attempts to improve patient flow in emergency departments by considering Lean techniques and Six Sigma methodology within a comprehensive conceptual framework. The proposed research will develop a systematic approach that integrates Lean techniques with Six Sigma methodology to improve patient flow in emergency departments. The results reported in this paper are based on a standard questionnaire survey of 350 patients in the Emergency Department of Aseer Central Hospital in Saudi Arabia. The results of the study allowed us to identify the most significant variables affecting patient satisfaction with patient flow, including waiting time during treatment in the emergency department; the effectiveness of the system when dealing with the patient’s complaints; and the layout of the emergency department. The proposed model will be developed into a performance evaluation metric based on these critical variables and will be evaluated in future work using fuzzy logic for continuous quality improvement.
Abstract:
Conservation of free-ranging cheetah (Acinonyx jubatus) populations is multifaceted and needs to be addressed from an ecological, biological and management perspective. There is a wealth of published research, with each study focusing on a particular aspect of cheetah conservation. Identifying the most important factors, making sense of various (and sometimes contrasting) findings, and taking decisions when little or no empirical data is available, are everyday challenges facing conservationists. Bayesian networks (BNs) provide a statistical modeling framework that enables analysis and integration of information addressing different aspects of conservation. There has been an increased interest in the use of BNs to model conservation issues; however, the development of more sophisticated BNs, utilizing object-oriented (OO) features, is still at the frontier of ecological research. We describe an integrated, parallel modeling process followed during a BN modeling workshop held in Namibia to combine expert knowledge and data about free-ranging cheetahs. The aim of the workshop was to obtain a more comprehensive view of the current viability of the free-ranging cheetah population in Namibia, and to predict the effect different scenarios may have on the future viability of this free-ranging cheetah population. Furthermore, a complementary aim was to identify influential parameters of the model to more effectively target those parameters having the greatest impact on population viability. The BN was developed by aggregating diverse perspectives from local and independent scientists, agents from the national ministry, conservation agency members and local fieldworkers. This integrated BN approach facilitates OO modeling in a multi-expert context, which lends itself to a series of integrated, yet independent, subnetworks describing different scientific and management components. We created three subnetworks in parallel: a biological, ecological and human factors network, which were then combined to create a complete representation of free-ranging cheetah population viability. Such OOBNs have widespread relevance to the effective and targeted conservation management of vulnerable and endangered species.
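As an illustration of how separately built subnetworks can feed a common viability node, here is a minimal sketch using the pgmpy library (class names as in pgmpy 0.x); the node names, structure and probabilities are invented for illustration, and this flat network is a simplification rather than the workshop's object-oriented BN.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Three simplified "subnetworks" feeding a common viability node.
model = BayesianNetwork([
    ("PreyAvailability", "EcologicalHealth"),   # ecological subnetwork
    ("GeneticDiversity", "BiologicalHealth"),   # biological subnetwork
    ("FarmerTolerance", "HumanPressure"),       # human-factors subnetwork
    ("EcologicalHealth", "PopulationViability"),
    ("BiologicalHealth", "PopulationViability"),
    ("HumanPressure", "PopulationViability"),
])

# All variables are binary (0 = poor/low, 1 = good/high); numbers are illustrative only.
cpds = [
    TabularCPD("PreyAvailability", 2, [[0.4], [0.6]]),
    TabularCPD("GeneticDiversity", 2, [[0.5], [0.5]]),
    TabularCPD("FarmerTolerance", 2, [[0.3], [0.7]]),
    TabularCPD("EcologicalHealth", 2, [[0.8, 0.2], [0.2, 0.8]],
               evidence=["PreyAvailability"], evidence_card=[2]),
    TabularCPD("BiologicalHealth", 2, [[0.7, 0.3], [0.3, 0.7]],
               evidence=["GeneticDiversity"], evidence_card=[2]),
    TabularCPD("HumanPressure", 2, [[0.2, 0.9], [0.8, 0.1]],
               evidence=["FarmerTolerance"], evidence_card=[2]),
    TabularCPD("PopulationViability", 2,
               # columns enumerate the parent states (Eco, Bio, Human)
               [[0.95, 0.8, 0.8, 0.6, 0.7, 0.5, 0.5, 0.1],
                [0.05, 0.2, 0.2, 0.4, 0.3, 0.5, 0.5, 0.9]],
               evidence=["EcologicalHealth", "BiologicalHealth", "HumanPressure"],
               evidence_card=[2, 2, 2]),
]
model.add_cpds(*cpds)
assert model.check_model()

# Scenario analysis: how does low farmer tolerance shift predicted viability?
infer = VariableElimination(model)
print(infer.query(["PopulationViability"], evidence={"FarmerTolerance": 0}))
```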
Abstract:
We present a mini-review of the development and contemporary applications of diffusion-sensitive nuclear magnetic resonance (NMR) techniques in biomedical sciences. Molecular diffusion is a fundamental physical phenomenon present in all biological systems. Due to the connection between experimentally measured diffusion metrics and the microscopic environment sensed by the diffusing molecules, diffusion measurements can be used for characterisation of molecular size, molecular binding and association, and the morphology of biological tissues. The emergence of magnetic resonance was instrumental in the development of biomedical applications of diffusion. We discuss the fundamental physical principles of diffusion NMR spectroscopy and diffusion MR imaging. The emphasis is placed on conceptual understanding, historical evolution and practical applications rather than complex technical details. The mathematical description of diffusion is presented only to the extent required for a basic understanding of the concepts. We present a wide range of spectroscopic and imaging applications of diffusion magnetic resonance, including colloidal drug delivery vehicles; protein association; characterisation of cell morphology; neural fibre tractography; cardiac imaging; and the imaging of load-bearing connective tissues. This paper is intended as an accessible introduction to the exciting and growing field of diffusion magnetic resonance.
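As a worked example of the basic mathematics referred to above, the standard free-diffusion relations used in pulsed-gradient spin-echo experiments (the Einstein relation and the Stejskal-Tanner attenuation) can be written as:

```latex
\langle x^{2} \rangle = 2Dt,
\qquad
\frac{S(b)}{S(0)} = e^{-bD},
\qquad
b = \gamma^{2} g^{2} \delta^{2}\left(\Delta - \frac{\delta}{3}\right)
```

where D is the diffusion coefficient, t the diffusion time, γ the gyromagnetic ratio, g and δ the amplitude and duration of the gradient pulses, and Δ the separation between the two pulses. These are textbook relations, not equations reproduced from this particular review.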
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the algebraic attack approach of Berbain et al. to Trivium-like ciphers and perform new analyses on them. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques to recover Bivium-A's initial state. Though our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive key search, the systems of equations which are constructed are smaller and less complex than those in previous algebraic analyses. We also answer an open question posed by Berbain et al. on the feasibility of applying their technique to Trivium-like ciphers. Factors which can affect the complexity of our attack on Trivium-like ciphers are discussed in detail. Analyses of Bivium-B and Trivium-N are omitted from this manuscript. The full paper is available on the IACR ePrint Archive.
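For reference, a minimal sketch of the keystream-generation phase of Trivium as published in its specification (key/IV loading and the warm-up rounds are omitted); it is the sparse, quadratic form of these update relations that algebraic attacks translate into systems of equations.

```python
def trivium_keystream(state, nbits):
    """Generate nbits of Trivium keystream from a 288-bit internal state.

    `state` is a list of 288 bits corresponding to s1..s288 in the Trivium
    specification (0-indexed here). Initialization is not performed.
    """
    s = list(state)
    z = []
    for _ in range(nbits):
        t1 = s[65] ^ s[92]                # s66 + s93
        t2 = s[161] ^ s[176]              # s162 + s177
        t3 = s[242] ^ s[287]              # s243 + s288
        z.append(t1 ^ t2 ^ t3)            # output bit
        t1 ^= (s[90] & s[91]) ^ s[170]    # + s91*s92 + s171
        t2 ^= (s[174] & s[175]) ^ s[263]  # + s175*s176 + s264
        t3 ^= (s[285] & s[286]) ^ s[68]   # + s286*s287 + s69
        # shift the three registers (lengths 93, 84 and 111)
        s = [t3] + s[:92] + [t1] + s[93:176] + [t2] + s[177:287]
    return z


# Example: all-ones state (illustrative only, not a valid key/IV setup).
print(trivium_keystream([1] * 288, 16))
```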
Abstract:
Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and negative impact on ecosystems, with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualization with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Consequently, Froehlich (2010) and his colleagues also use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved. The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance – the comfort of ECOS’ virtual denizens and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychometrics, specifically those measures relating to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements have been ascertained, they are balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. Calculating the relative amount of energy produced by wind and solar can be done by, in the case of solar for example, considering the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budget allocated to green energy sources such as the Solar Panels and Wind Generator, and the Air Conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, modeled on the real energy usage statistics drawn from the SEC data provided by the building managers.
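A minimal sketch of the balance described above, with invented coefficients and function names; it is illustrative only and not the ECOS project's actual thermodynamic model (live weather would be polled from a weather API rather than hard-coded as it is here).

```python
def hvac_demand_kw(outside_c, thermostat_c, coeff_kw_per_deg=2.5):
    """Energy needed to heat or cool toward the thermostat setting (illustrative)."""
    return abs(outside_c - thermostat_c) * coeff_kw_per_deg

def solar_output_kw(panel_area_m2, irradiance_w_m2, efficiency=0.18):
    """Photovoltaic output estimated from panel area and incident irradiance."""
    return panel_area_m2 * irradiance_w_m2 * efficiency / 1000.0

def building_balance_kw(outside_c, thermostat_c, panel_area_m2, irradiance_w_m2,
                        wind_kw=0.0, gas_kw=0.0):
    """Positive result: the building generates more than it consumes."""
    generated = solar_output_kw(panel_area_m2, irradiance_w_m2) + wind_kw + gas_kw
    consumed = hvac_demand_kw(outside_c, thermostat_c)
    return generated - consumed

# Example: a hot day, thermostat at 23 degrees C.
print(building_balance_kw(outside_c=34.0, thermostat_c=23.0,
                          panel_area_m2=200.0, irradiance_w_m2=650.0, wind_kw=5.0))
```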
Abstract:
Lens average and equivalent refractive indices are required for purposes such as lens thickness estimation and optical modeling. We modeled the refractive index gradient as a power function of the normalized distance from the lens center. Average index along the lens axis was estimated by integration. Equivalent index was estimated by raytracing through a model eye to establish ocular refraction, and then backward raytracing to determine the constant refractive index yielding the same refraction. Assuming center and edge indices remained constant with age, at 1.415 and 1.37, respectively, average axial refractive index increased (1.408 to 1.411) and equivalent index decreased (1.425 to 1.420) as age increased from 20 to 70 years. These values agree well with experimental estimates based on different techniques, although the latter show considerable scatter. The simple model of the index gradient gives reasonable estimates of average and equivalent lens indices, although refinements in modeling and measurements are required.
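A minimal sketch of the axial averaging by integration, assuming one plausible power-function form of the gradient, n(rho) = n_center + (n_edge - n_center) * rho**p with rho the normalized distance from the lens center; the exponent used here is illustrative, not the paper's fitted value.

```python
def lens_index(rho, n_center=1.415, n_edge=1.37, p=5.5):
    """Refractive index at normalized distance rho (0 = center, 1 = edge),
    modeled as a power function of rho (exponent p is illustrative)."""
    return n_center + (n_edge - n_center) * rho ** p

def average_axial_index(p=5.5, n_center=1.415, n_edge=1.37, steps=10000):
    """Average index along the axis by numerical integration (trapezoidal rule)."""
    h = 1.0 / steps
    total = 0.5 * (lens_index(0.0, n_center, n_edge, p) + lens_index(1.0, n_center, n_edge, p))
    total += sum(lens_index(i * h, n_center, n_edge, p) for i in range(1, steps))
    return total * h

# Closed form for comparison: n_center + (n_edge - n_center) / (p + 1)
print(average_axial_index(p=5.5))          # ~1.408
print(1.415 + (1.37 - 1.415) / (5.5 + 1))  # same value analytically
```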
Abstract:
Tumour angiogenesis is an important factor for tumour growth and metastasis. Although some recent reports suggest that microvessel counts in non-small cell lung cancer are related to a poor disease outcome, the results were not conclusive and were not compared with other molecular prognostic markers. In the present study, the vascular grade was assessed in 107 (T1,2-N0,1) operable non-small cell lung carcinomas, using the JC70 monoclonal antibody to CD31. Three vascular grades were defined with appraisal by eye and by Chalkley counting: high (Chalkley score 7-12), medium (5-6), and low (2-4). There was a significant correlation between eye appraisal and Chalkley counting (P < 0.0001). Vascular grade was not related to histology, grade, proliferation index (Ki67), or EGFR or p53 expression. Tumours from younger patients had a higher grade of angiogenesis (P = 0.05). Of the factors examined, only vascular grade was statistically related to lymph node metastasis (P < 0.0001). A univariate analysis of survival showed that vascular grade was the most significant prognostic factor (P = 0.0004), followed by N-stage (P = 0.001). In a multivariate analysis, N-stage and vascular grade were not found to be independent prognostic factors, since they were strongly related to each other. Excluding N-stage, vascular grade was the only independent prognostic factor (P = 0.007). Kaplan-Meier survival curves showed a statistically significant worse prognosis for patients with high vascular grade, but no difference was observed between low and medium vascular grade. These data suggest that angiogenesis in operable non-small cell lung cancer is a major prognostic factor for survival and, among the parameters tested, is the only factor related to cancer cell migration to lymph nodes. The integration of vascular grading into clinical trials on adjuvant chemotherapy and/or radiotherapy could substantially contribute to defining groups of operable patients who might benefit from cytotoxic treatment.
Abstract:
This paper proposes that critical realism can provide a useful theoretical foundation to study enterprise architecture (EA) evolution. Specifically, it investigates the practically relevant and academically challenging question of how EAs integrate Service-oriented Architecture (SOA). Archer’s Morphogenetic theory is used as an analytical approach to distinguish the architectural conditions under which SOA is introduced, to study the relationships between these conditions and SOA introduction, and to reflect on the EA evolution (elaborations) that then takes place. The focus lies on the reasons why EA evolution takes place (or not) and what architectural changes happen. This paper uses the findings of a literature review to build an a priori model informed by Archer’s theory to understand EA evolution in a field that often lacks a solid theoretical groundwork. The findings are threefold. First, EA can evolve on different levels (different integration outcomes). Second, the integration outcomes are classified into three levels: business architecture, information systems architecture and technology architecture. Third, the analytical separation using Archer’s theory is helpful for understanding how these different integration outcomes are generated.
Abstract:
Earthwork planning is considered in this article, and a generic block partitioning and modelling approach has been devised to provide strategic plans at various levels of detail. Conceptually, this approach is more accurate and comprehensive than others, for instance those that are section based. In response to environmental concerns, the metrics chosen for decision making were fuel consumption and emissions. Haulage distance and gradient are also included, as they are important components of these metrics. Advantageously, the fuel consumption metric is generic and captures the physical difficulty of travelling over inclines of different gradients in a way that is consistent across all hauling vehicles. For validation, the proposed models and techniques have been applied to a real world road project. The numerical investigations demonstrated that the models can be solved with relatively little CPU time. The proposed block models also result in solutions of superior quality, i.e. they have reduced fuel consumption and cost. Furthermore, the plans differ considerably from those based solely upon a distance-based metric, demonstrating a need for industry to reflect upon its current practices.
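A minimal sketch of a haulage fuel metric of the kind described, combining distance and gradient; the functional form and coefficients are invented for illustration and are not the article's model.

```python
def haul_fuel_litres(distance_m, height_change_m, payload_t,
                     flat_rate_l_per_tkm=0.05, lift_rate_l_per_tm=0.002):
    """Fuel to haul one load between two blocks (illustrative coefficients).

    Flat-haul fuel grows with distance and payload; extra fuel is charged for
    hauling uphill (positive height change), and none is refunded downhill.
    """
    flat = flat_rate_l_per_tkm * payload_t * distance_m / 1000.0
    climb = lift_rate_l_per_tm * payload_t * max(0.0, height_change_m)
    return flat + climb

# Compare a short steep haul against a long flat haul for a 40 t load.
print(haul_fuel_litres(distance_m=400.0, height_change_m=12.0, payload_t=40.0))
print(haul_fuel_litres(distance_m=1500.0, height_change_m=0.0, payload_t=40.0))
```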
Abstract:
Using cooperative learning in classrooms promotes academic achievement, communication skills, problem-solving, social skills and student motivation. Yet it is reported that cooperative learning as a Western educational concept may be ineffective in Asian cultural contexts. This study investigates the use of scaffolding techniques for cooperative learning in Thai primary mathematics classes. A teacher training program was designed to foster Thai primary school teachers’ implementation of cooperative learning. Two teachers participated in this experimental program for one and a half weeks and then implemented cooperative learning strategies in their mathematics classes for six weeks. The data collected from teacher interviews and classroom observations indicate that the difficulty or failure of implementing cooperative learning in Thai education may not derive directly from cultural differences. Instead, the data indicate that Thai culture can be constructively merged with cooperative learning through a teacher training program and the practice of scaffolding techniques.
Abstract:
Thailand’s education reform adopted cooperative learning to improve the quality of education. However, it has been reported that the introduction and maintenance of cooperative learning have been difficult and uncertain because of cultural differences. This study proposed a conceptual framework that connects Thai culture with the elements of cooperative learning, and implemented a small-scale research project in a Thai primary mathematics class with one teacher and thirty-two Grade 4 students. The results revealed that three components (preparation of teachers, instructional strategies and preparation of students) can serve as vehicles for cultural integration in cooperative learning.
Abstract:
In 2012, Queensland University of Technology (QUT) committed to the massive project of revitalizing its Bachelor of Science (ST01) degree. Like most universities in Australia, QUT has begun work to align all courses by 2015 to the requirements of the updated Australian Qualifications Framework (AQF), which is regulated by the Tertiary Education Quality and Standards Agency (TEQSA). From the very start of the redesigned degree program, students approach scientific study with an exciting mix of theory and highly topical real world examples through their chosen “grand challenge.” These challenges, Fukushima and nuclear energy for example, are the lenses used to explore science and lead to 21st century learning outcomes for students. For the teaching and learning support staff, our grand challenge is to expose all science students to multidisciplinary content with a strong emphasis on embedding information literacies into the curriculum. With ST01, QUT is taking the initiative to rethink not only content but how units are delivered and even how we work together between the faculty, the library and learning and teaching support. This was the desired outcome, but as we move from design to implementation, has this goal been achieved? A main component of the new degree is to ensure scaffolding of information literacy skills throughout the entirety of the three-year course. However, with the strong focus on problem-based learning and group work skills, many issues arise both for students and lecturers. A move away from a traditional lecture style is necessary but impacts on academics’ workload and comfort levels. Therefore, academics in collaboration with librarians and other learning support staff must draw on each other’s expertise to work together to ensure pedagogy, assessments and targeted classroom activities are mapped within and between units. This partnership can counteract the tendency of isolated, unsupported academics to concentrate on day-to-day teaching at the expense of consistency between units and big picture objectives. Support staff may have a more holistic view of a course or degree than coordinators of individual units, making communication and truly collaborative planning even more critical. As well, due to staffing time pressures, the design and delivery of new curriculum is generally done quickly, with no option for the designers to stop and reflect on the experience and outcomes. It is vital we take this unique opportunity to closely examine what QUT has and hasn’t achieved to be able to recommend a better way forward. This presentation will discuss these important issues and stumbling blocks, to provide a set of best practice guidelines for QUT and other institutions. The aim is to help improve collaboration within the university, as well as to maximize students’ ability to put information literacy skills into action. As our students embark on their own grand challenges, we must challenge ourselves to honestly assess our own work.
Abstract:
The movement of molecules inside living cells is a fundamental feature of biological processes. The ability to both observe and analyse the details of molecular diffusion in vivo at the single-molecule and single-cell level can add significant insight into understanding molecular architectures of diffusing molecules and the nanoscale environment in which the molecules diffuse. The tool of choice for monitoring dynamic molecular localization in live cells is fluorescence microscopy; in particular, combining total internal reflection fluorescence with the use of fluorescent protein (FP) reporters offers exceptional imaging contrast for dynamic processes in the cell membrane under relatively physiological conditions compared with competing single-molecule techniques. There exist several different complex modes of diffusion, and discriminating these from each other is challenging at the molecular level owing to underlying stochastic behaviour. Analysis is traditionally performed using mean square displacements of tracked particles; however, this generally requires more data points than is typical for single FP tracks owing to photophysical instability. Presented here is a novel approach allowing robust Bayesian ranking of diffusion processes to discriminate multiple complex modes probabilistically. It is a computational approach that biologists can use to understand single-molecule features in live cells.
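For context, a minimal sketch of the traditional mean-square-displacement analysis mentioned above, assuming free two-dimensional Brownian motion (MSD(tau) = 4*D*tau); the Bayesian ranking approach of the paper itself is not reproduced here.

```python
def mean_square_displacement(track, max_lag=None):
    """MSD of a 2D track given as a list of (x, y) positions at equal time steps."""
    n = len(track)
    max_lag = max_lag or n - 1
    msd = []
    for lag in range(1, max_lag + 1):
        disp = [(track[i + lag][0] - track[i][0]) ** 2 +
                (track[i + lag][1] - track[i][1]) ** 2
                for i in range(n - lag)]
        msd.append(sum(disp) / len(disp))
    return msd

def diffusion_coefficient(msd, dt, n_points=4):
    """Estimate D from the first few MSD points using MSD(t) = 4*D*t (2D Brownian motion)."""
    lags = range(1, n_points + 1)
    slope = sum(m * lag * dt for m, lag in zip(msd, lags)) / sum((lag * dt) ** 2 for lag in lags)
    return slope / 4.0

# Toy track (positions in micrometres, frames 10 ms apart).
track = [(0.00, 0.00), (0.05, -0.02), (0.08, 0.03), (0.12, 0.01), (0.10, 0.07), (0.15, 0.09)]
msd = mean_square_displacement(track)
print(msd)
print(diffusion_coefficient(msd, dt=0.01))
```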
Abstract:
Plants transformed with Agrobacterium frequently contain T-DNA concatemers with direct-repeat (d/r) or inverted-repeat (i/r) transgene integrations, and these repetitive T-DNA insertions are often associated with transgene silencing. To facilitate the selection of transgenic lines with simple T-DNA insertions, we constructed a binary vector (pSIV) based on the principle of hairpin RNA (hpRNA)-induced gene silencing. The vector is designed so that any transformed cells that contain more than one insertion per locus should generate hpRNA against the selective marker gene, leading to its silencing. These cells should, therefore, be sensitive to the selective agent and less likely to regenerate. Results from Arabidopsis and tobacco transformation showed that pSIV gave considerably fewer transgenic lines with repetitive insertions than did a conventional T-DNA vector (pCON). Furthermore, the transgene was more stably expressed in the pSIV plants than in the pCON plants. Rescue of plant DNA flanking sequences from pSIV plants was significantly more frequent than from pCON plants, suggesting that pSIV is potentially useful for T-DNA tagging. Our results revealed a perfect correlation between the presence of tail-to-tail inverted repeats and transgene silencing, supporting the view that read-through hpRNA transcript derived from i/r T-DNA insertions is a primary inducer of transgene silencing in plants. © CSIRO 2005.