923 results for Mathematical Cardiovascular Model
Abstract:
Conventional horizontal drilling processes rely on magnetic surveying techniques to determine the position and orientation of the bottom-hole assembly (BHA). These methods increase the weight of the drilling assembly, raise costs because non-magnetic collars are required to shield the magnetometers, and introduce significant errors in the position of the drill bit. A fiber-optic gyroscope (FOG) based inertial navigation system (INS) has been proposed as an alternative to magnetometer-based downhole surveying. The use of a tactical-grade FOG-based surveying system in the harsh downhole environment has been shown to be theoretically feasible, yielding a significant reduction in BHA position error (less than 100 m over a 2-h experiment). To limit the growing errors of the INS, an in-drilling alignment (IDA) method has been proposed. This article describes a simple, pneumatics-based design of the IDA apparatus and its downhole implementation. A mathematical model of the setup is developed and tested with Bloodshed Dev-C++. The simulations demonstrate a simple, low-cost and feasible IDA apparatus.
Abstract:
This paper presents the research and development of a mathematical model for the optimal distribution of (primarily financial) resources to maintain a new (improved) level of quality (reliability) of a complex system for which a restructuring decision has been made. The final model, together with its computational algorithm, answers the following questions: how many elements of the system should be allocated for modernization, which elements, and to what depth each selected element should be modernized; the optimal answers are determined by the criterion of minimizing financial cost.
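The three questions the model answers (how many elements to modernize, which ones, and to what depth) amount to a discrete cost minimization under a reliability constraint. A minimal sketch, assuming a series system and exhaustive search over hypothetical upgrade options (none of these numbers come from the paper):

```python
from itertools import product

def min_cost_upgrade(options, target):
    """options[i] lists (cost, reliability) upgrade depths for element i,
    with the first entry meaning 'leave as-is'. Returns the cheapest
    combination whose series-system reliability meets the target."""
    best = None
    for choice in product(*options):
        cost = sum(c for c, _ in choice)
        rel = 1.0
        for _, r in choice:
            rel *= r  # series system: element reliabilities multiply
        if rel >= target and (best is None or cost < best[0]):
            best = (cost, choice)
    return best
```

Exhaustive search is exponential in the number of elements; the paper's algorithm would replace it for realistic system sizes.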
Abstract:
Miglena G. Kirilova-Doneva - A one-dimensional relaxation experiment was performed on 14 specimens of human umbilical fascia. The mechanical behavior of the fascia during relaxation was modeled using the nonlinear Maxwell-Gurevich-Rabinovich theory. The model parameters for the investigated specimens were determined, and their values were compared depending on the loading direction of the specimens during the experiment. It was found that the values of the initial viscosity η0* and of the parameter m*, which is influenced by the strain rate of the material, vary within very wide limits not only for specimens from different donors but also for specimens from a single donor. By applying the model, the change in the viscosity and the viscous deformation of the material during relaxation was calculated. It was shown that the change in viscosity and viscous deformation depends on the loading direction of the specimens.
Abstract:
The cell:cell bond between an immune cell and an antigen-presenting cell is a necessary event in the activation of the adaptive immune response. At the juncture between the cells, cell-surface molecules on the opposing cells form non-covalent bonds, and a distinct patterning is observed that is termed the immunological synapse. An important binding molecule in the synapse is the T-cell receptor (TCR), which is responsible for antigen recognition through its binding with a major histocompatibility complex with bound peptide (pMHC). This bond leads to intracellular signalling events that culminate in the activation of the T-cell and ultimately in the expression of the immune effector function. The temporal analysis of TCR bonds during the formation of the immunological synapse presents a problem to biologists, because the spatio-temporal scales involved (nanometers and picoseconds) are comparable to experimental uncertainty limits. In this study, a linear stochastic model, derived from a nonlinear model of the synapse, is used to analyse the temporal dynamics of the bond attachments for the TCR. Mathematical analysis and numerical methods are employed to analyse the qualitative dynamics of the nonequilibrium membrane dynamics, with the specific aim of calculating the average persistence time for the TCR:pMHC bond. A single-threshold method, which has previously been used to successfully calculate the TCR:pMHC contact path sizes in the synapse, is applied to produce results for the average contact times of the TCR:pMHC bonds. This method is extended through the development of a two-threshold method, which produces results suggesting the average persistence time for the TCR:pMHC bond is on the order of 2-4 seconds, values that agree with experimental evidence for TCR signalling.
The study reveals two distinct scaling regimes in the time-persistent survival probability density profile of these bonds, one dominated by thermal fluctuations and the other associated with TCR signalling. Analysis of the thermal fluctuation regime reveals a minimal contribution to the average persistence time calculation, which has an important biological implication when comparing the probabilistic models to experimental evidence: in cases where only a few statistics can be gathered from experimental conditions, the results are unlikely to match the probabilistic predictions. The results also identify a rescaling relationship between the thermal noise and the bond length, suggesting that recalibrating the experimental conditions to adhere to this scaling relationship will enable biologists to identify the start of the signalling regime for previously unobserved receptor:ligand bonds. Finally, the regime associated with TCR signalling exhibits a universal decay rate for the persistence probability that is independent of the bond length.
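The single- and two-threshold ideas can be sketched on any noisy time series; the AR(1) surrogate signal and the threshold values in the example below are illustrative stand-ins, not the study's membrane model:

```python
import numpy as np

def persistence_times_single(x, thresh):
    """Durations of runs during which x stays above a single threshold."""
    above = x > thresh
    d = np.diff(above.astype(int))
    starts = np.where(d == 1)[0] + 1
    ends = np.where(d == -1)[0] + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        ends = np.r_[ends, len(x)]
    return ends - starts

def persistence_times_two(x, lo, hi):
    """Hysteresis variant: an event starts when x crosses `hi` upward and
    ends only when x falls below `lo`, so brief thermal dips below `hi`
    do not terminate a bond event."""
    times, start, active = [], 0, False
    for i, xi in enumerate(x):
        if not active and xi > hi:
            active, start = True, i
        elif active and xi < lo:
            times.append(i - start)
            active = False
    if active:  # close a trailing open event at the end of the record
        times.append(len(x) - start)
    return np.array(times)
```

Because the two-threshold method merges runs separated by brief dips, its events are fewer and longer on average, which is what separates signalling-scale persistence from thermal fluctuations.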
Abstract:
The authors screened 34 large cattle herds for the presence of Mycoplasma bovis infection by examining slaughtered cattle for macroscopic lung lesions, by culturing M. bovis from lung lesions and at the same time by testing sera for the presence of antibodies against M. bovis. Among the 595 cattle examined, 33.9% had pneumonic lesions, mycoplasmas were isolated from 59.9% of pneumonic lung samples, and 10.9% of sera from those animals contained antibodies to M. bovis. In 25.2% of the cases M. bovis was isolated from lungs with no macroscopic lesions. The proportion of seropositive herds was 64.7%. The average seropositivity rate of individuals was 11.3%, but in certain herds it exceeded 50%. A probability model was developed for examining the relationship among the occurrence of pneumonia, the isolation of M. bovis from the lungs and the presence of M. bovis specific antibodies in sera.
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) originating from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that account for the advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. Phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved after the numerical validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration, and its code was therefore selected for further development into the computer program and software. The model was then analyzed for its sensitivity; it was found to be very sensitive to wind speed but not to sediment settling velocity. Computer software was developed with the model code embedded. The software provides two user-friendly visual forms, one to interface with the database files and the other to execute the model and present the graphical and tabulated results. All predicted maximum concentrations of BTX and MTBE were over an order of magnitude lower than current drinking water standards.
It should be pointed out, however, that concentrations below these reported standards, although not harmful to humans, may be very harmful to organisms at the trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
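The two iterative solvers compared in the study can be sketched generically; the small tridiagonal system in the test below is an illustrative stand-in for the discretized transport equations, not the actual river model:

```python
import numpy as np

def jacobi(A, b, tol=1e-8, max_iter=10000):
    """Jacobi iteration: every component is updated from the previous iterate."""
    x = np.zeros_like(b)
    D = np.diag(A)
    R = A - np.diagflat(D)
    for k in range(1, max_iter + 1):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

def gauss_seidel(A, b, tol=1e-8, max_iter=10000):
    """Gauss-Seidel: each component update reuses values already computed in
    the current sweep, which typically converges in fewer iterations."""
    x = np.zeros_like(b)
    n = len(b)
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k
    return x, max_iter
```

For diagonally dominant systems like discretized advection-dispersion equations, both converge, with Gauss-Seidel roughly halving the iteration count, consistent with the convergence-rate finding reported above.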
Abstract:
The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam-element load sensor was used to obtain the weight of a deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation is commonly not considered in these models; instead, an initial value of layer thickness and porosity is regularly assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. The drop-wise early condensation on a cold flat plate under natural convection, exposed to warm (room temperature) humid air, was modeled. A nucleation rate was found, and the ratio of heat to mass transfer (Lewis number) was obtained. The Lewis number was found to be much smaller than unity, whereas a value of unity is usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
Abstract:
To predict the maneuvering performance of a propelled SPAR vessel, a mathematical model was established as a path simulator. A system-based mathematical model was chosen as it offers advantages in cost and time over full Computational Fluid Dynamics (CFD) simulations. The model is intended to provide a means of optimizing the maneuvering performance of this new vessel type. In this study the hydrodynamic forces and control forces are investigated as individual components, combined in a vectorial setting, and transferred to a body-fixed basis. SPAR vessels are known to be very sensitive to large-amplitude motions during maneuvers due to their relatively small hydrostatic restoring forces. Previous model tests of SPAR vessels have shown significant roll and pitch amplitudes, especially during course-change maneuvers; thus, a full 6-DOF equation of motion was employed in the current numerical model. The mathematical model employed in this study was a combination of the model introduced by the Maneuvering Modeling Group (MMG) and the Abkowitz (1964) model. The new model represents the forces applied to the ship hull, the propeller forces, and the rudder forces independently, as proposed by the MMG, but uses the 6-DOF equation of motion introduced by Abkowitz to describe the motion of a maneuvering ship. The mathematical model was used to simulate the trajectory and motions of the propelled SPAR vessel in 10°/10°, 20°/20° and 30°/30° standard zig-zag maneuvers, as well as turning circle tests at rudder angles of 20° and 30°. The simulation results were used to determine the maneuverability parameters (e.g. advance, transfer and tactical diameter) of the vessel. The final model provides the means of predicting and assessing the performance of the vessel type and can be easily adapted to specific vessel configurations based on the generic SPAR-type vessel used in this study.
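As a greatly simplified stand-in for the 6-DOF MMG/Abkowitz model, a zig-zag maneuver can be sketched with a first-order Nomoto yaw model (T r' + r = K delta); the gain K and time constant T below are arbitrary illustrative values, not identified SPAR-vessel coefficients:

```python
import numpy as np

def zigzag(K=0.2, T=10.0, delta_max=np.deg2rad(20.0),
           psi_switch=np.deg2rad(20.0), dt=0.1, t_end=300.0):
    """20/20 zig-zag with a first-order Nomoto yaw-response model.
    The rudder is reversed each time the heading passes the switch angle."""
    psi, r, delta = 0.0, 0.0, delta_max
    headings, switches = [], 0
    for _ in range(int(t_end / dt)):
        r += dt * (K * delta - r) / T   # yaw-rate dynamics
        psi += dt * r                   # heading integration
        headings.append(psi)
        if delta > 0 and psi > psi_switch:
            delta, switches = -delta_max, switches + 1
        elif delta < 0 and psi < -psi_switch:
            delta, switches = delta_max, switches + 1
    return np.array(headings), switches
```

The heading overshoot beyond the switch angle after each rudder reversal is the quantity read off a zig-zag test; the full 6-DOF model adds the roll and pitch responses that make SPAR vessels a special case.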
Abstract:
Conventional reliability models for parallel systems are not applicable to the analysis of parallel systems with load transfer and sharing. In this short communication, the dependent failures of parallel systems are first analyzed, and a reliability model of the load-sharing parallel system is presented based on Miner's cumulative damage theory and the total probability formula. The parallel system reliability is then calculated by Monte Carlo simulation when the component life follows the Weibull distribution. The results show that the proposed reliability model can analyze and evaluate the reliability of parallel systems in the presence of load transfer.
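The Monte Carlo step can be sketched for the simplest case of two components; the accelerated-failure-time treatment of load transfer and the Weibull parameters below are illustrative assumptions, not the damage-accumulation model of the communication:

```python
import numpy as np

def system_life(rng, shape=2.0, scale=100.0, accel=2.0):
    """One trial: two components share the load; after the first failure the
    survivor carries the full load, and its remaining life is compressed by
    `accel` (an illustrative accelerated-failure-time assumption)."""
    t1, t2 = scale * rng.weibull(shape, 2)
    first, second = min(t1, t2), max(t1, t2)
    return first + (second - first) / accel

def reliability(t, n_trials=20000, seed=1, **kw):
    """Monte Carlo estimate of P(system life > t)."""
    rng = np.random.default_rng(seed)
    lives = np.array([system_life(rng, **kw) for _ in range(n_trials)])
    return float(np.mean(lives > t))
```

Setting accel=1.0 recovers the independent parallel system, so the comparison shows how ignoring load transfer over-estimates reliability.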
Abstract:
The viscosity of ionic liquids (ILs) has been modeled as a function of temperature at atmospheric pressure using a new method based on the UNIFAC–VISCO method. This model extends the calculations previously reported by our group (see Zhao et al. J. Chem. Eng. Data 2016, 61, 2160–2169), which used 154 experimental viscosity data points of 25 ionic liquids to regress a set of binary interaction parameters and ion Vogel–Fulcher–Tammann (VFT) parameters. Discrepancies between experimental data for the same IL affect the quality of the correlation and thus the development of the predictive method. In this work, mathematical gnostics was used to analyze the experimental data from different sources and to recommend one set of reliable data for each IL. These recommended data (819 data points in total) for 70 ILs were correlated using this model to obtain an extended set of binary interaction parameters and ion VFT parameters, with a regression accuracy of 1.4%. In addition, 966 experimental viscosity data points for 11 binary mixtures of ILs were collected from the literature to extend and test this model. The binary data consist of 128 training data points used for the optimization of binary interaction parameters and 838 test data points used for comparison with the predicted values. The relative average absolute deviation (RAAD) for training and test is 2.9% and 3.9%, respectively.
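The VFT temperature dependence at the core of the correlation can be sketched with a least-squares fit; the synthetic data and parameter values below are illustrative, and the UNIFAC–VISCO group-contribution part is not reproduced:

```python
import numpy as np
from scipy.optimize import curve_fit

def vft_ln_viscosity(T, A, B, T0):
    """Vogel-Fulcher-Tammann form: ln(eta) = A + B / (T - T0)."""
    return A + B / (T - T0)

def raad(calc, exp):
    """Relative average absolute deviation, the accuracy metric quoted above."""
    return float(np.mean(np.abs(calc - exp) / exp))
```

Fitting ln(eta) rather than eta itself keeps the residuals comparable across the orders of magnitude that IL viscosities span with temperature.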
Abstract:
Nowadays, risks arising from the rapid development of the oil and gas industries are increasing significantly. As a result, one of the main concerns of both industrial and environmental managers is the identification and assessment of such risks in order to develop and maintain appropriate proactive measures. Oil spills from stationary sources in offshore zones are among the accidents with several adverse impacts on marine ecosystems. Considering a site's current situation and the relevant requirements and standards, the risk assessment process is capable not only of recognizing the probable causes of accidents but also of estimating the probability of occurrence and the severity of consequences. In this way, the results of risk assessment help managers and decision makers create and employ proper control methods. Most of the available models for risk assessment of oil spills are built on accurate databases and analysis of historical data, but unfortunately such databases are not accessible in most zones, especially in developing countries, or else they are newly established and not yet applicable. This reveals the necessity of using expert systems and fuzzy set theory, which make it possible to formalize the expertise and experience of specialists who have worked in petroliferous areas for many years. On the other hand, in developing countries the damages to the environment and environmental resources are often not considered risk assessment priorities and tend to be underestimated. For this reason, the model proposed in this research specifically addresses the environmental risk of oil spills from stationary sources in offshore zones.
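The expert-system idea can be illustrated with a toy Mamdani-style fuzzy rule base; the membership functions and the single rule set below are hypothetical, far simpler than what expert elicitation would produce:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_risk(prob, severity):
    """Mamdani min-max inference on two inputs in [0, 1].
    Rule base (illustrative): risk is HIGH only if both probability and
    severity are HIGH; any other combination drives risk LOW."""
    low = lambda x: tri(x, -0.5, 0.0, 0.5)
    high = lambda x: tri(x, 0.5, 1.0, 1.5)
    z = np.linspace(0.0, 1.0, 101)      # output universe for 'risk'
    act_high = min(high(prob), high(severity))
    act_low = max(min(low(prob), low(severity)),
                  min(low(prob), high(severity)),
                  min(high(prob), low(severity)))
    agg = np.maximum(np.minimum(act_high, high(z)),
                     np.minimum(act_low, low(z)))
    return float((agg * z).sum() / agg.sum())  # centroid defuzzification
```

A real system would carry many linguistic terms per variable and dozens of expert-elicited rules; the mechanics of activation, aggregation and defuzzification stay the same.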
Abstract:
There is clear evidence that in typically developing children reasoning and sense-making are essential in all mathematical learning and understanding processes. In children with autism spectrum disorders (ASD), however, these become much more significant, considering their importance to successful independent living. This paper presents a preliminary proposal of a digital environment, specifically targeted to promote the development of mathematical reasoning in students with ASD. Given the diversity of ASD, the prototyping of this environment requires the study of dynamic adaptation processes and the development of activities adjusted to each user’s profile. We present the results obtained during the first phase of this ongoing research, describing a conceptual model of the proposed digital environment. Guidelines for future research are also discussed.
Abstract:
Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models to support the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat; it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables that differ in terms of scale, weight, and type. Though many of these variables are recognized by specialists in security studies, there remains controversy with respect to their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically-weighted regression analysis produced the most accurate result to accommodate non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism.
This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
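Geographically weighted regression fits a separate weighted least-squares model at every location, with weights decaying with distance from that location; the Gaussian kernel and the synthetic two-region data below are purely illustrative, not the France case-study variables:

```python
import numpy as np

def gwr_fit_point(X, y, coords, point, bandwidth):
    """Local coefficients at `point`: weighted least squares with a Gaussian
    spatial kernel, the core step of geographically weighted regression.
    Returns (intercept, slope) for a single feature X."""
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)   # Gaussian kernel weights
    Xa = np.column_stack([np.ones(len(X)), X])
    W = np.diag(w)
    # solve the weighted normal equations (Xa' W Xa) beta = Xa' W y
    return np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y)
```

Repeating the fit over a grid of points yields coefficient surfaces, which is how non-stationary coefficient behavior of the kind reported above is detected.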