988 results for mathematical parameters


Relevance: 20.00%

Publisher:

Abstract:

This paper focuses on the turning point experiences that worked to transform the researcher during a preliminary consultation process to seek permission to conduct a small pilot project on one Torres Strait Island. The project aimed to learn from parents how they support their children in their mathematics learning. Drawing on a community research design, a consultative meeting was held with one Torres Strait Islander community to discuss the possibility of piloting a small project focused on working with parents and children to learn about early mathematics processes. Preliminary data indicated that parents use networks in their community. It highlighted the funds of mathematical knowledge that exist in the community and that are used to teach children. Such knowledges are situated within a community’s unique histories, culture and the voices of the people. In the Island community, the “Omei” tree means the Tree of Wisdom.

Relevance: 20.00%

Publisher:

Abstract:

Increasing sodium intake from 70 to 200 mmol/day elevates blood pressure in normotensive volunteers by 6/4 mmHg. Older people, people with reduced renal function on a low sodium diet, and people with a family history of hypertension are more likely to show this effect. The rise in blood pressure was associated with a fall in plasma volume, suggesting that plasma volume changes do not initiate hypertension. In normotensive individuals, the most common abnormality in membrane sodium transport induced by an extra sodium load was an increased permeability of the red cell to sodium. Some normotensive individuals also had an increase in the level of a plasma inhibitor of Na-K ATPase, and these individuals also appeared to have a rise in blood pressure. Sodium intake and blood pressure are related; the relationship differs between individuals and is probably controlled by the genetically inherited capacity of the systems involved in membrane sodium transport.

Relevance: 20.00%

Publisher:

Abstract:

A total histological grade does not necessarily distinguish between different manifestations of cartilage damage or degeneration. An accurate and reliable histological assessment method is required to separate normal and pathological tissue within a joint during treatment of degenerative joint conditions, and to sub-classify the latter in meaningful ways. The Modified Mankin method may be adaptable for this purpose. We investigated how much detail may be lost by assigning one composite score/grade to represent different degenerative components of the osteoarthritic condition. We used four ovine injury models (sham surgery, anterior cruciate ligament/medial collateral ligament instability, simulated anatomic anterior cruciate ligament reconstruction and meniscal removal) to induce different degrees and potentially different 'types' (mechanisms) of osteoarthritis. Articular cartilage was systematically harvested, prepared for histological examination and graded in a blinded fashion using a Modified Mankin grading method. The results showed that the possible permutations of cartilage damage were far more varied than current histological grading systems are intended to capture: of 1352 cartilage specimens graded, 234 different manifestations of potential histological damage were observed across 23 potential individual grades of the Modified Mankin grading method. These results show that composite histological grades may mask information that could potentially discern different stages or mechanisms of cartilage damage and degeneration in a sheep model. This approach may be applicable to other grading systems.

Relevance: 20.00%

Publisher:

Abstract:

A number of mathematical models investigating aspects of the complicated process of wound healing have been reported in the literature in recent years. However, effective numerical methods, and supporting error analysis, for the fractional equations that describe the process of wound healing are still limited. In this paper, we consider numerical simulation of a fractional model based on coupled advection-diffusion equations for cell and chemical concentration in a polar coordinate system. The space fractional derivatives are defined in the left and right Riemann-Liouville sense, with the fractional orders in the advection and diffusion terms belonging to the intervals (0, 1) or (1, 2], respectively. The numerical treatment proceeds in several steps. Firstly, the coupled advection-diffusion equations are decoupled into a single space-fractional advection-diffusion equation in a polar coordinate system. Secondly, we propose a new implicit difference method for simulating this equation, using the equivalence of the Riemann-Liouville and Grünwald-Letnikov fractional derivative definitions. Thirdly, its stability and convergence are discussed. Finally, numerical results are given to support the theoretical analysis.
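To make the key ingredient concrete: such implicit schemes rest on approximating Riemann-Liouville derivatives by Grünwald-Letnikov sums. The sketch below (a minimal illustration under an assumed grid and test function, not the paper's scheme) computes the Grünwald-Letnikov weights by recurrence and applies the first-order shifted Grünwald formula for a left Riemann-Liouville derivative of order 1 < α ≤ 2.

```python
import numpy as np

def gl_weights(alpha, n):
    # Grünwald-Letnikov weights g_k = (-1)^k * C(alpha, k), via the
    # recurrence g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1) / k)
    g = np.empty(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def left_rl_derivative(u, h, alpha):
    # Shifted Grünwald approximation of the left Riemann-Liouville
    # derivative of order 1 < alpha <= 2 on a uniform grid of spacing h
    n = len(u)
    g = gl_weights(alpha, n)
    d = np.zeros(n)
    for i in range(n - 1):  # last node omitted: the shift needs u[i + 1]
        d[i] = sum(g[k] * u[i + 1 - k] for k in range(i + 2)) / h**alpha
    return d

# Sanity check: for alpha -> 2 the result approaches the second derivative
x = np.linspace(0, 1, 101)
print(left_rl_derivative(x**2, x[1] - x[0], 1.999)[50])  # close to 2
```

In an implicit method, these weights populate the lower-triangular band of the system matrix solved at each time step.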

Relevance: 20.00%

Publisher:

Abstract:

We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
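The contrast between analytic and finite-difference derivatives of the density matrix can be illustrated with a toy example (a single ideal x-pulse on one spin, not the ECF machinery itself): for ρ(θ) = U ρ₀ U† with U = exp(-iθ Ix), the derivative is the commutator expression dρ/dθ = -i[Ix, ρ(θ)], which a finite-difference calculation only approximates at extra cost.

```python
import numpy as np
from scipy.linalg import expm

# Spin-1/2 operators
Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)

def rho(theta, rho0=Iz):
    # Density matrix after an ideal x-pulse of flip angle theta
    U = expm(-1j * theta * Ix)
    return U @ rho0 @ U.conj().T

theta = np.pi / 3

# Analytic derivative: d(rho)/d(theta) = -i [Ix, rho(theta)]
r = rho(theta)
analytic = -1j * (Ix @ r - r @ Ix)

# Finite-difference check, for comparison
eps = 1e-6
fd = (rho(theta + eps) - rho(theta - eps)) / (2 * eps)

print(np.max(np.abs(analytic - fd)))  # agreement to ~1e-10
```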

Relevance: 20.00%

Publisher:

Abstract:

Rapid urbanisation and the resulting continuous increase in traffic have been recognised as key factors contributing to increased pollutant loads in urban stormwater and, in turn, in receiving waters. Urbanisation primarily increases anthropogenic activities and the percentage of impervious surfaces in urban areas, and these processes are collectively responsible for urban stormwater pollution. In this regard, urban traffic and land-use related activities have been recognised as the primary pollutant sources, due to the generation of a range of key pollutants such as solids, heavy metals and polycyclic aromatic hydrocarbons (PAHs). Appropriate treatment system design is the most viable approach to mitigating stormwater pollution; however, limited understanding of pollutant processes and transport pathways constrains effective treatment design. This highlights the necessity for a detailed understanding of traffic and other land-use related pollutant processes and pathways in relation to urban stormwater pollution.

This study has created new knowledge of pollutant processes and transport pathways, encompassing atmospheric pollutants, atmospheric deposition and the build-up on ground surfaces of traffic-generated key pollutants. The research was primarily based on in-depth experimental investigations. This thesis describes the knowledge created relating to the processes of atmospheric pollutant build-up, atmospheric deposition and road surface build-up, and establishes their relationships as a chain of processes.

The analysis of atmospheric deposition revealed that both traffic and land-use related sources contribute total suspended particulate matter (TSP) to the atmosphere. Traffic sources are dominant during weekdays, whereas land-use related sources become dominant during weekends due to the reduction in traffic. The analysis further concluded that atmospheric TSP, PAH and heavy metal (HM) concentrations are strongly influenced by total average daily heavy-duty traffic, traffic congestion and the fraction of commercial and industrial land uses. A set of mathematical equations was developed to predict TSP, PAH and HM concentrations in the atmosphere based on the influential traffic and land-use related parameters.

Dry deposition samples were collected for different antecedent dry periods, and wet deposition samples were collected immediately after rainfall events. Dry deposition was found to increase with the number of antecedent dry days and consisted of relatively coarser particles (greater than 1.4 µm) compared to wet deposition. Wet deposition showed a strong affinity with rainfall depth but was not related to the antecedent dry period. It was also found that smaller particles (less than 1.4 µm) travel much longer distances from the source and deposit mainly with wet deposition. Pollutants in wet deposition are less sensitive to source characteristics than those in dry deposition. Atmospheric deposition of HMs is not directly influenced by land use but rather by proximity to high-emission sources such as highways. It is therefore important to consider atmospheric deposition as a key pollutant source to urban stormwater in the vicinity of such sources.

Build-up was analysed for five particle size fractions, namely <1 µm, 1-75 µm, 75-150 µm, 150-300 µm and >300 µm, for solids, PAHs and HMs. The outcomes indicated that PAHs and HMs in the <75 µm size fraction are generated mainly by traffic-related activities, whereas the >150 µm size fraction is generated by both traffic and land-use related sources. Atmospheric deposition is an important source for HM build-up on roads, whereas the contribution of PAHs from atmospheric sources is limited.

A comprehensive approach was developed to predict traffic and other land-use related pollutants in urban stormwater based on traffic and other land-use characteristics. This approach primarily comprised a set of mathematical equations linking traffic and land-use characteristics to stormwater quality through mathematical modelling. The outcomes of this research will contribute to the design of appropriate treatment systems to safeguard urban receiving water quality under future traffic growth scenarios. The 'real world' application of the knowledge generated was demonstrated through mathematical modelling of solids in urban stormwater, accounting for the variability in traffic and land-use characteristics.
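As a purely hypothetical illustration of the form such predictive equations take (the variable names follow the influential factors named above, but the data and coefficients are invented, not the thesis results), a linear model of atmospheric TSP can be fitted by least squares:

```python
import numpy as np

# Hypothetical predictors and response; values are synthetic
rng = np.random.default_rng(7)
n = 40
heavy_duty_traffic = rng.uniform(100, 2000, n)   # average daily heavy-duty volume
congestion = rng.uniform(0.1, 0.9, n)            # congestion index
commercial_industrial = rng.uniform(0, 1, n)     # fraction of land use
tsp = (0.02 * heavy_duty_traffic + 40 * congestion
       + 25 * commercial_industrial + rng.normal(0, 5, n))

# Ordinary least squares fit of TSP on the three predictors
X = np.column_stack([np.ones(n), heavy_duty_traffic,
                     congestion, commercial_industrial])
beta, *_ = np.linalg.lstsq(X, tsp, rcond=None)
print(beta)  # intercept followed by one coefficient per predictor
```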

Relevance: 20.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods; analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses.

This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; a data-capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle.

Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time.

Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of a detected change to a tighter time-frame prior to the signal. The approach yields highly informative estimates of change point parameters, since results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.

The benefits of change point investigation are revisited and demonstrated in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then extended to healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts; variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts; here the survival time is also affected by patient mix, and the survival function is constructed using a survival prediction model.

The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is reinforced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen here may also extend to the industrial and business domains where quality monitoring was initially developed.
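The flavour of the change point estimation can be conveyed with a simplified sketch: for a step change in a Poisson process, Gamma-Poisson conjugacy lets the segment rates be integrated out in closed form, giving a posterior over the change time τ directly. (The thesis uses Bayesian hierarchical models with Markov chain Monte Carlo; the priors and simulated data below are assumptions for illustration.)

```python
import numpy as np
from scipy.special import gammaln

def log_marginal(y, a=1.0, b=1.0):
    # Log marginal likelihood of Poisson counts with a Gamma(a, b) prior on
    # the rate, rate integrated out analytically; the sum(log y!) term is
    # constant across segmentations and dropped
    S, m = np.sum(y), len(y)
    return a * np.log(b) - gammaln(a) + gammaln(a + S) - (a + S) * np.log(b + m)

def changepoint_posterior(y):
    # Posterior over the step-change time tau, uniform prior on tau
    n = len(y)
    logp = np.array([log_marginal(y[:t]) + log_marginal(y[t:])
                     for t in range(1, n)])
    logp -= logp.max()            # stabilise before exponentiating
    p = np.exp(logp)
    return p / p.sum()

# Simulated monitoring data: the rate steps from 2 to 5 at time 60
rng = np.random.default_rng(1)
y = np.concatenate([rng.poisson(2, 60), rng.poisson(5, 40)])
post = changepoint_posterior(y)
print("posterior mode of tau:", np.argmax(post) + 1)  # near 60
```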

Relevance: 20.00%

Publisher:

Abstract:

Contemporary mathematics education attempts to instil in learners a conceptualization of mathematics as a highly organised and inter-connected set of ideas. To support this, a means of graphically representing this organisation of ideas is presented, one that reflects the cognitive mechanisms that shape a learner's understanding. This organisation of information may then be analysed with a view to informing the design of mathematics instruction in face-to-face and/or computer-mediated learning environments. However, this analysis requires significant work to develop both theory and practice.

Relevance: 20.00%

Publisher:

Abstract:

We examine the solution of the two-dimensional Cahn-Hilliard-reaction (CHR) equation in the xy plane as a model of Li+ intercalation into LiFePO4 material. We validate our numerical solution against the solution of the depth-averaged equation, which has been used to model intercalation in the limit of highly orthotropic diffusivity and gradient penalty tensors. We then examine the phase-change behaviour in the full CHR system as these parameters become more isotropic, and find that as the Li+ diffusivity is increased in the x direction, phase separation persists at high currents, even in small crystals with averaged coherency strain included. The resulting voltage curves decrease monotonically, which has previously been considered a hallmark of crystals that fill homogeneously.
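For readers unfamiliar with the underlying model, a minimal 1D Cahn-Hilliard solver (polynomial double well with an order parameter in [-1, 1]; no reaction term and no coherency strain, so a much-reduced cousin of the CHR system, with all parameter values assumed) shows the semi-implicit spectral treatment such simulations commonly use:

```python
import numpy as np

# Semi-implicit Fourier-spectral stepping for 1D Cahn-Hilliard:
#   dc/dt = Laplacian(mu),  mu = c**3 - c - kappa * Laplacian(c)
N, L = 256, 2 * np.pi
kappa, dt, steps = 0.01, 1e-4, 20000
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular wavenumbers
rng = np.random.default_rng(0)
c = 0.05 * rng.standard_normal(N)            # near-uniform mixture, mean ~ 0

for _ in range(steps):
    mu_hat = np.fft.fft(c**3 - c)            # non-gradient part of the potential
    c_hat = np.fft.fft(c)
    # gradient-penalty term handled implicitly for stability
    c_hat = (c_hat - dt * k**2 * mu_hat) / (1 + dt * kappa * k**4)
    c = np.fft.ifft(c_hat).real

print(c.min(), c.max())   # phase separation: values approach the wells at -1, +1
```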

Relevance: 20.00%

Publisher:

Abstract:

This article presents a case study of corporate dialogue with vulnerable others. Dialogue with marginalized external groups is increasingly presented in the business literature as the key to making corporate social responsibility possible, in particular through corporate learning. At the same time, corporate public communications promote community engagement as a core aspect of corporate social responsibility. This article examines the possibilities for, and conditions underpinning, corporate dialogue with marginalized stakeholders as occurred around the unexpected and sudden closure in January 2009 of the AU$2.2 billion BHP Billiton Ravensthorpe Nickel mine in rural Western Australia. In doing so, we draw on John Roberts' notion of dialogue with vulnerable others and apply a discourse analysis approach to data spanning corporate public communications and interviews with residents affected by the decision to close the mine. In presenting this case study, we contribute to the as yet limited organizational research concerned directly with marginalized stakeholders, and argue that corporate social responsibility discourse and vulnerable-other dialogue not only affirm the primacy of business interests but also co-opt vulnerable others in the pursuit of those interests. In conclusion, we consider the case study's implications for critical understandings of corporate dialogue with vulnerable others.

Relevance: 20.00%

Publisher:

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
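A minimal sketch of the ABC rejection step at the heart of the utility evaluation (with a toy Poisson model standing in for the epidemic and macroparasite processes, and an assumed prior, summary statistic and tolerance): draws from the prior are kept when their simulated summary lands within the tolerance of the observed one. In the design setting, the precision of the resulting ABC posterior, e.g. its inverse variance, would then score the candidate design.

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data; a Poisson model stands in for a stochastic process
# with an intractable likelihood so the sketch stays self-contained
true_rate = 3.0
y_obs = rng.poisson(true_rate, size=50)
s_obs = y_obs.mean()                      # summary statistic

def abc_rejection(n_sims=100_000, tol=0.1):
    # Plain ABC rejection: draw from the prior, simulate, and keep draws
    # whose simulated summary lies within tol of the observed summary
    theta = rng.uniform(0, 10, n_sims)    # prior on the rate
    sims = rng.poisson(theta[:, None], size=(n_sims, 50))
    keep = np.abs(sims.mean(axis=1) - s_obs) < tol
    return theta[keep]

posterior = abc_rejection()
print(posterior.mean(), posterior.std())  # ABC posterior mean and spread
```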

Relevance: 20.00%

Publisher:

Abstract:

Mathematical English is a unique language based on ordinary English, with the addition of highly stylised formal symbol systems. Some words have a redefined status. Mathematical English has its own lexicon, syntax, semantics and literature. It is more difficult to understand than ordinary English. Ability in basic interpersonal communication does not necessarily result in proficiency in the use of mathematical English. The complex nature of mathematical English may impact upon the ability of students to succeed in mathematical and numeracy assessment. This article presents a review of the literature about the complexities of mathematical English. It includes examples of more than fifty language features that have been shown to add to the challenge of interpreting mathematical texts. Awareness of the complexities of mathematical English is an essential skill needed by mathematics teachers when teaching and when designing assessment tasks.

Relevance: 20.00%

Publisher:

Abstract:

Many computationally intensive scientific applications involve repetitive floating point operations other than addition and multiplication, which may present a significant performance bottleneck due to the relatively large latency or low throughput involved in executing such arithmetic primitives on commodity processors. A promising alternative is to execute such primitives on Field Programmable Gate Array (FPGA) hardware acting as an application-specific custom co-processor in a high performance reconfigurable computing platform. The use of FPGAs can provide advantages such as fine-grain parallelism, but issues relating to code development in a hardware description language and efficient data transfer to and from the FPGA chip can present significant application development challenges. In this paper, we discuss our practical experiences in developing a selection of floating point hardware designs to be implemented using FPGAs. Our designs include some basic mathematical library functions which can be implemented for user-defined precisions, suitable for novel applications requiring non-standard floating point representation. We discuss the details of our designs along with results from performance and accuracy analysis tests.
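One way to prototype the accuracy side of such analyses in software (a hedged sketch, not the paper's method; the format and bit width are assumptions) is to round values to a user-defined mantissa width and inspect the resulting error, mimicking a custom-width FPGA floating point format:

```python
import math

def round_to_precision(x, mant_bits):
    # Round x to a float with the given number of mantissa bits,
    # round-to-nearest, mimicking a custom-width floating point format
    if x == 0.0 or not math.isfinite(x):
        return x
    m, e = math.frexp(x)            # x = m * 2**e with 0.5 <= |m| < 1
    scale = 2.0 ** mant_bits
    return math.ldexp(round(m * scale) / scale, e)

# Accuracy loss for an 18-bit mantissa, a width that maps naturally onto
# the 18-bit hardware multipliers found in many FPGA DSP blocks
x = math.pi
print(x - round_to_precision(x, 18))   # error on the order of 1e-6
```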

Relevance: 20.00%

Publisher:

Abstract:

Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation models. This research compares the capabilities of the mainstream car-following models in emulating precise driver behaviour parameters such as headway and Time to Collision. The comparison first establishes which model is more robust in reproducing these metrics. Second, a series of sensitivity tests further explores the behaviour of each model. Based on the outcomes of this two-step exploration, a modified structure and parameter adjustments are proposed for each car-following model to simulate more realistic vehicle movements, particularly headways and Times to Collision below a certain critical threshold. NGSIM vehicle trajectory data is used to evaluate the modified models' performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical Time to Collision events better than the generic models, while the improvement in headway reproduction is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
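For concreteness, the safety metrics at issue can be computed directly from trajectory samples such as those in NGSIM. The sketch below (hypothetical values; the 4.5 m vehicle length and the formula are standard conventions, not this paper's code) evaluates instantaneous Time to Collision, which would then be thresholded, commonly at around 1.5-3 s, to count critical events:

```python
import numpy as np

def time_to_collision(x_lead, v_lead, x_foll, v_foll, lead_length):
    # Instantaneous TTC for a follower approaching its leader;
    # infinite when the follower is not closing the gap
    gap = x_lead - x_foll - lead_length      # bumper-to-bumper spacing
    closing = v_foll - v_lead                # positive when closing
    ttc = np.full_like(gap, np.inf, dtype=float)
    mask = closing > 0
    ttc[mask] = gap[mask] / closing[mask]
    return ttc

# Hypothetical trajectory samples (positions in m, speeds in m/s)
x_lead, v_lead = np.array([50.0, 52.0]), np.array([10.0, 10.0])
x_foll, v_foll = np.array([20.0, 24.0]), np.array([15.0, 14.5])
print(time_to_collision(x_lead, v_lead, x_foll, v_foll, lead_length=4.5))
```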