942 results for Linear Static Analysis


Relevance:

80.00%

Publisher:

Abstract:

Background: Monocytes are implicated in the initiation and progression of the atherosclerotic plaque, contributing to plaque instability and rupture. Little is known of the role played by the 3 phenotypically and functionally different monocyte subpopulations in determining ventricular remodeling following ST elevation myocardial infarction (STEMI). Mon1 are "classical" inflammatory monocytes, whilst Mon3 are considered reparative with fibroblast deposition ability. The function of the newly described Mon2 is yet to be elucidated.

Method: STEMI patients (n=196, mean age 62±13 years; 72% male) treated with percutaneous revascularization were recruited within the first 24 hours. Peripheral blood monocyte subpopulations were enumerated and characterized using flow cytometry after staining for CD14, CD16 and CCR2. Phenotypically, monocyte subpopulations are defined as: CD14+CD16-CCR2+ (Mon1), CD14+CD16+CCR2+ (Mon2) and CD14lowCD16+CCR2- (Mon3) cells. Transthoracic 2D echocardiography was performed within 7 days and at 6 months post infarct to assess ventricular volumes, mass, and systolic and diastolic function.

Results: Using linear regression analysis, higher counts for Mon1 and lower counts for Mon2 and Mon3 were significantly associated with the baseline left ventricular ejection fraction (LVEF) within seven days post infarction. At 6 months post STEMI, lower counts of Mon2 remained positively associated with decreased LVEF (p=0.002).

Monocyte subsets correlation with LVEF (monocyte mean fluorescence intensity, cells/μl):

            Baseline LVEF (%)        LVEF (%) at 6 months post infarct
            β-value   P-value        β-value   P-value
Total Mon   0.31      <0.001         0.36      0.009
Mon 1       0.019     0.02           0.07      0.62
Mon 2       −0.28     0.001          −0.42     0.002
Mon 3       −0.27     0.001          −0.18     0.21

Conclusion: Peripheral monocytes of all three subsets correlate with LVEF after a myocardial infarction. High counts of the inflammatory Mon1 are associated with a reduction in the baseline LVEF. Post remodelling, the convalescent EF was independently predicted by monocyte subpopulation 2. As lower counts depicted negative ventricular remodeling, this suggests a reparative role for the newly described Mon2, possibly via myofibroblast deposition and angiogenesis, in contrast to an anticipated inflammatory role.

Relevance:

80.00%

Publisher:

Abstract:

The transition of laterally heated flows in a vertical layer in the presence of a streamwise pressure gradient is examined numerically for different values of the Prandtl number. The stability analysis of the basic flow for the pure hydrodynamic case (Pr = 0) was reported in [1]. We find that in the absence of transverse pumping the previously known critical parameters are recovered [2], while as the strength of the Poiseuille flow component is increased the convective motion is delayed considerably. Following the linear stability analysis of the vertical channel flow, our attention is focused on a study of the finite amplitude secondary travelling-wave (TW) solutions that develop from perturbations of the transverse roll type imposed on the basic flow and temperature profiles. The linear stability of the secondary TWs against three-dimensional perturbations is also examined, and it is shown that the bifurcating tertiary flows are phase-locked to the secondary TWs.
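
To make the procedure concrete, here is a minimal sketch of the temporal linear stability calculation described above: assemble the linearized operator for a given streamwise wavenumber on a 1-D grid and inspect its spectrum, flagging instability when an eigenvalue acquires a positive real part. The operator below is a toy advection-diffusion problem with an assumed Poiseuille base profile, Dirichlet wall conditions, and an illustrative Reynolds number; it is not the full coupled velocity-temperature problem of the paper.

```python
# Minimal temporal linear-stability sketch: for each wavenumber k, assemble
# the operator of the linearized problem dq/dt = L(k) q on a 1-D grid and
# check its spectrum.  Toy advection-diffusion operator, NOT the full
# stability problem of the abstract; Re, U(y) and BCs are assumptions.
import numpy as np

def growth_rate(k, Re=5772.0, N=101):
    y = np.linspace(-1.0, 1.0, N)
    h = y[1] - y[0]
    U = 1.0 - y**2                      # assumed Poiseuille base profile
    # second-difference matrix (interior rows are the standard 3-point stencil)
    D2 = (np.diag(np.ones(N - 1), 1) - 2.0 * np.eye(N)
          + np.diag(np.ones(N - 1), -1)) / h**2
    L = -1j * k * np.diag(U) + (D2 - k**2 * np.eye(N)) / Re
    L = L[1:-1, 1:-1]                   # enforce q = 0 at the walls
    return np.linalg.eigvals(L).real.max()

for k in (0.5, 1.0, 2.0):
    s = growth_rate(k)
    print(f"k = {k}: max growth rate = {s:.4f} ({'unstable' if s > 0 else 'stable'})")
```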

Relevance:

80.00%

Publisher:

Abstract:

The competitiveness of nations is based on the successful functioning of the institutions that support the division of labor in value creation – this is the basic principle of this research. Our project investigates what values and motivations shape the institutional setting of the Hungarian economy. We study the world of norms, conventions and innovations – the elements that shape the institutions; the static analysis of official rules has only a minor role in this approach. The research focuses on (1) the value system of entrepreneurs, (2) the mindset of public managers and executives of economic policy, (3) the factors of local economic competitiveness, and (4) the actions of the private and non-profit sectors to enhance competitiveness. The main finding of this research is that the cognitive factors that shape the competitiveness of Hungary – the norms and motivations of decision makers in the economy – give positive support to the competitiveness-strengthening initiatives of the business, non-profit and public sectors. The studies on the value systems of entrepreneurs and citizens show that expectations and moral values connected to competitiveness are strong. The public managers of economic policy are open-minded, and there is a general consensus among experts, business and politics on the key competitiveness challenges of Hungary. There are well-defined frameworks to conceptualize the schemes that can compensate organizers' efforts in private initiatives for competitiveness. There are various developments in the field of institutions for local economic development. The non-profit sector shows good results in the common fields of competitiveness and equity (such as atypical forms of employment) despite the uncertainties in the background of the sector. These results open perspectives both for scientific research and for practical application. Research on the connection between individual goals and the motivation to improve the value-creating ability of society, together with the study of social innovation, can reveal new aspects of competitiveness. Business, non-profit or public leaders can better synchronize their strategies with the expectations of consumers, communities and constituencies if their intentions to shape institutional settings fit the norms and conventions of the social and economic stakeholders.

Relevance:

80.00%

Publisher:

Abstract:

The nation's freeway systems are becoming increasingly congested. A major contributor to traffic congestion on freeways is traffic incidents. Traffic incidents are non-recurring events, such as accidents or stranded vehicles, that cause a temporary roadway capacity reduction, and they can account for as much as 60 percent of all traffic congestion on freeways. One major freeway incident management strategy involves diverting traffic to avoid incident locations by relaying timely information through Intelligent Transportation Systems (ITS) devices such as dynamic message signs or real-time traveler information systems. The decision to divert traffic depends foremost on the expected duration of an incident, which is difficult to predict. In addition, the duration of an incident is affected by many contributing factors. Determining and understanding these factors can help the process of identifying and developing better strategies to reduce incident durations and alleviate traffic congestion. A number of research studies have attempted to develop models to predict incident durations, yet with limited success.

This dissertation research attempts to improve on this previous effort by applying data mining techniques to a comprehensive incident database maintained by the District 4 ITS Office of the Florida Department of Transportation (FDOT). Two categories of incident duration prediction models were developed: "offline" models designed for use in the performance evaluation of incident management programs, and "online" models for real-time prediction of incident duration to aid in the decision making of traffic diversion in the event of an ongoing incident. Multiple data mining analysis techniques were applied and evaluated in the research. Multiple linear regression analysis and a decision-tree-based method were applied to develop the offline models, and a rule-based method and a tree algorithm called M5P were used to develop the online models.

The results show that the models in general can achieve high prediction accuracy within acceptable time intervals of the actual durations. The research also identifies some new contributing factors that have not been examined in past studies. As part of the research effort, software code was developed to implement the models in the existing software system of District 4 FDOT for actual applications.
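
As a hedged illustration of the two offline model families named above, the sketch below fits a multiple linear regression and a regression tree to synthetic incident records. The feature names and the data are invented placeholders, not fields from the FDOT District 4 database, and scikit-learn's plain regression tree stands in for the dissertation's decision-tree-based method.

```python
# Fit the two offline model families on synthetic incident records and
# compare their test-set error.  All data here is fabricated for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(1, 7, n),          # lanes blocked (assumed feature)
    rng.integers(0, 2, n),          # truck involved? (assumed feature)
    rng.uniform(0, 24, n),          # hour of day (assumed feature)
])
# synthetic "true" incident duration in minutes, plus noise
y = 15 + 10 * X[:, 0] + 25 * X[:, 1] + rng.normal(0, 8, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(), DecisionTreeRegressor(max_depth=4)):
    model.fit(X_tr, y_tr)
    mae = mean_absolute_error(y_te, model.predict(X_te))
    print(type(model).__name__, f"MAE = {mae:.1f} min")
```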

Relevance:

80.00%

Publisher:

Abstract:

Prior to 2000, fewer than 1.6 million students were enrolled in at least one online course. By fall 2010, student enrollment in online distance education had shown a phenomenal 283% increase, to 6.1 million. Two years later, this number had grown to 7.1 million. In light of this significant growth and of skepticism about quality, there have been calls for greater oversight of this format of educational delivery. Accrediting bodies tasked with this oversight have developed guidelines and standards for online education, but there is a lack of empirical studies that examine the relationship between accrediting standards and student success. The purpose of this study was to examine the relationship between student success and the presence of two Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) standards for online education in online courses: (a) student support services and (b) curriculum and instruction. An original 24-item survey with an overall reliability coefficient of .94 was administered to students (N=464) at Florida International University, enrolled in 24 university-wide undergraduate online courses during fall 2014, who rated the presence of these standards in their online courses. The general linear model was utilized to analyze the data. The results indicated that the two standards, student support services and curriculum and instruction, were both significantly and positively correlated with student success, but with small R2 values and strengths of association less than .35 and .20, respectively. Chi-square tests for differences in student success between higher- and lower-rated online courses produced mixed results when controlling for covariates such as discipline, gender, race/ethnicity, GPA, age, and number of online courses previously taken. A multiple linear regression analysis revealed that the curriculum and instruction standard was the only variable that accounted for a significant amount of unique variance in student success. Another regression test revealed no significant interaction effect between the two SACSCOC standards and GPA in predicting student success. The results of this study are useful for administrators, faculty, and researchers who are interested in accreditation standards for online education and how these standards relate to student success.
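
A minimal sketch of the final regression test described above (the standard-by-GPA interaction), written as a statsmodels formula on simulated data; the column names, scales, and generated values are assumptions for illustration, not the FIU survey data.

```python
# Test for a standards-by-GPA interaction with an OLS formula.
# Simulated placeholder data; in the study these would be survey ratings.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 464
df = pd.DataFrame({
    "support":    rng.uniform(1, 5, n),   # student-support-services rating
    "curriculum": rng.uniform(1, 5, n),   # curriculum-and-instruction rating
    "gpa":        rng.uniform(2, 4, n),
})
df["success"] = 60 + 4 * df["curriculum"] + 5 * df["gpa"] + rng.normal(0, 8, n)

# main effects plus the two standard-by-GPA interaction terms
fit = smf.ols("success ~ support * gpa + curriculum * gpa", data=df).fit()
print(fit.summary().tables[1])   # interaction rows should be non-significant here
```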

Relevance:

80.00%

Publisher:

Abstract:

Kernel-level malware is one of the most dangerous threats to the security of users on the Internet, so there is an urgent need for its detection. The most popular detection approach is misuse-based detection. However, it cannot keep up with today's advanced malware, which increasingly applies polymorphism and obfuscation. In this thesis, we present our integrity-based detection for kernel-level malware, which does not rely on the specific features of malware.

We have developed an integrity analysis system that can derive and monitor integrity properties for commodity operating system kernels. In our system, we focus on two classes of integrity properties: data invariants and integrity of Kernel Queue (KQ) requests.

We adopt static analysis for data invariant detection and overcome several technical challenges: field-sensitivity, array-sensitivity, and pointer analysis. We identify data invariants that are critical to system runtime integrity from Linux kernel 2.4.32 and the Windows Research Kernel (WRK) with very low false positive and false negative rates. We then develop an Invariant Monitor to guard these data invariants against real-world malware. In our experiment, we are able to use Invariant Monitor to detect ten real-world Linux rootkits, nine real-world Windows malware samples, and one synthetic Windows malware sample.

We leverage static and dynamic analysis of the kernel and device drivers to learn the legitimate KQ requests. Based on the learned KQ requests, we build KQguard to protect KQs. At runtime, KQguard rejects all unknown KQ requests that cannot be validated. We apply KQguard to the WRK and the Linux kernel, and extensive experimental evaluation shows that KQguard is efficient (up to 5.6% overhead) and effective (capable of achieving zero false positives against representative benign workloads after appropriate training, and very low false negatives against 125 real-world malware samples and nine synthetic attacks).

In our system, Invariant Monitor and KQguard cooperate to protect data invariants and KQs in the target kernel. By monitoring these integrity properties, we can detect malware by its violation of them during execution.
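
The data-invariant idea can be illustrated with a toy user-space analog: learn the fields whose values never change across benign snapshots of a kernel object, then flag any later snapshot that violates them. This is only a sketch of the concept; the thesis derives its invariants by static analysis of kernel source, and the field names and addresses below are invented.

```python
# Toy analog of data-invariant detection and monitoring.  Snapshots are
# dicts standing in for kernel objects; field names/values are invented.
def learn_invariants(snapshots):
    """Keep only fields with a single observed value across benign snapshots."""
    candidates = dict(snapshots[0])
    for snap in snapshots[1:]:
        candidates = {k: v for k, v in candidates.items() if snap.get(k) == v}
    return candidates

def check(snapshot, invariants):
    """Return the invariants this snapshot violates."""
    return [k for k, v in invariants.items() if snapshot.get(k) != v]

benign = [
    {"syscall_table": 0xc0100000, "idt": 0xc0200000, "jiffies": 101},
    {"syscall_table": 0xc0100000, "idt": 0xc0200000, "jiffies": 202},
]
inv = learn_invariants(benign)          # 'jiffies' varies, so it is dropped
suspect = {"syscall_table": 0xdeadbeef, "idt": 0xc0200000, "jiffies": 303}
print("violated invariants:", check(suspect, inv))   # -> ['syscall_table']
```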

Relevance:

80.00%

Publisher:

Abstract:

Time series analysis has played an increasingly important role in weather and climate studies. The success of these studies depends crucially on knowledge of the quality of climate data such as, for instance, air temperature and rainfall data. For this reason, one of the main challenges for researchers in this field is to obtain homogeneous series. A time series of climate data is considered homogeneous when the values of the observed data change only due to climatic factors, i.e., without any interference from external non-climatic factors. Such non-climatic factors may produce undesirable effects in the time series, such as unrealistic homogeneity breaks, trends and jumps. The present work investigated climatic time series for the city of Natal, RN, namely air temperature and rainfall time series, for the period spanning from 1961 to 2012. The main purpose was to carry out an analysis in order to check for the occurrence of homogeneity breaks or trends in the series under investigation. To this end, some basic statistical procedures were applied, such as normality and independence tests. The occurrence of trends was investigated by linear regression analysis, as well as by the Spearman and Mann-Kendall tests. Homogeneity was investigated by the SNHT, as well as by the Easterling-Peterson and Mann-Whitney-Pettitt tests. The normality analyses showed divergent results. The von Neumann ratio test showed that the air temperature data are not independent and identically distributed (iid), whereas the rainfall data are iid. According to the applied tests, both series display trends: the mean air temperature series displays an increasing trend, whereas the rainfall series shows a decreasing trend. Finally, the homogeneity tests revealed that all series under investigation present inhomogeneities, although the breaks depend on the applied test. In summary, the results showed that the chosen techniques may be applied in order to verify how well the studied time series are characterized. Therefore, these results should serve as a guide for further investigations into the statistical climatology of Natal or of any other place.
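
For reference, here is a minimal implementation of the Mann-Kendall trend test mentioned above, using the standard S statistic and its normal approximation (without the tie correction), applied to a made-up warming series.

```python
# Minimal Mann-Kendall trend test (no tie correction).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S = sum of signs over all pairs (i < j)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0    # variance of S, no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)            # continuity correction
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))              # two-sided p-value
    return z, p

rng = np.random.default_rng(2)
series = 26.0 + 0.02 * np.arange(52) + rng.normal(0, 0.3, 52)  # toy warming series
z, p = mann_kendall(series)
print(f"Z = {z:.2f}, p = {p:.4f}")   # positive Z indicates an increasing trend
```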

Relevance:

80.00%

Publisher:

Abstract:

The scope of this study was to identify socioeconomic contextual and health care factors in primary care associated with maternal near misses and their marker conditions. This is an ecological study that used, as the unit of analysis, aggregated data from 63 clusters formed by the municipalities of the State of Rio Grande do Norte, Brazil, using the SKATER method of area regionalization. The ratios of maternal near misses and their marker conditions were obtained from the Hospital Information System of the Brazilian Unified Health System. In multiple linear regression analysis, there was a significant association between maternal near misses and variables of poverty and poor primary health care. Hypertensive disorders were also associated with poverty and poor primary care, and the occurrence of hemorrhaging was associated with infant mortality. It was observed that the occurrence of maternal near misses is linked to unfavorable socioeconomic conditions and poor-quality health care, which are a reflection of public policies that accentuate health inequalities.


Relevance:

80.00%

Publisher:

Abstract:

The characteristic profile of individuals who develop AIDS in Brazil has changed over time. Among these changes, a worrying finding is the increased incidence of AIDS in the elderly across the country. However, it is not yet clear whether the increase in AIDS cases has been sufficient to change the trend of these rates in recent years in the Brazilian states, nor whether this increase is related to socioeconomic and demographic indicators. In this sense, the objective of this study is to analyze AIDS incidence rates among the elderly in Brazil and their relationship with socioeconomic and demographic inequalities in the period from 2000 to 2012. This is an ecological time-series study of the behavior of the incidence rates of AIDS in the elderly from 2000 to 2012. The rates were calculated using secondary data from the Notifiable Diseases Information System and the Brazilian Institute of Geography and Statistics. Data were analyzed statistically to determine the trends in incidence rates, using a polynomial regression model and a joinpoint log-linear regression model, as well as simple linear regression analysis to relate the trends to socioeconomic and demographic variables. The SPSS 20.0® and Joinpoint 4.1.1 programs were used. All tests were carried out at a significance level of 5%. In Brazil, 62,052 new cases of AIDS in the elderly were reported from 2000 to 2012. During this period, a significant increase was found for males, both aged 50-59 years (AAPC: 3.46%, p<0.001) and over 59 years (AAPC: 4.38%, p<0.001). For females, the increase was also significant, with the largest increments in the time series compared to males in both age groups (AAPC: 4.62%, p<0.001 and AAPC: 6.53%, p<0.001, respectively). The largest increases are observed in women and in the states of the North and Northeast; in the Southeast, rates remained stable throughout the series. The male-to-female ratio of the rates decreased significantly, with the sexes converging in both age groups of the study, reaching a ratio of 1.7 males for every female in the youngest age group. The trends were related to illiteracy rates, greater social inequality, and lower human development in the Brazilian states. We conclude that in Brazil the incidence of AIDS in the elderly follows an increasing trend in individuals over 50 years. Noteworthy are the higher rates in women and in the states of the North and Northeast. In this sense, the country needs to strengthen policies towards older people with STD/AIDS, training health professionals and developing effective measures for the prevention and early diagnosis of infected people, especially in places with limited resources and high social inequality. In the long term, new studies should be developed to assess whether the measures taken were effective in reducing the trends identified in this study.
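
The joinpoint-style trend measure reported above (AAPC/APC) comes from a log-linear fit: regress ln(rate) on year and report APC = 100·(e^b − 1), where b is the slope. As a worked example, the sketch below recovers the growth rate of a toy series; the rates are invented for illustration.

```python
# Annual percent change (APC) from a log-linear trend fit, as in joinpoint
# regression over a single segment.  Rates are fabricated for illustration.
import numpy as np

years = np.arange(2000, 2013)
rates = 10.0 * 1.045 ** (years - 2000)      # toy series growing ~4.5% per year

b, a = np.polyfit(years, np.log(rates), 1)  # slope of ln(rate) vs year
apc = 100.0 * (np.exp(b) - 1.0)
print(f"APC = {apc:.2f}% per year")         # recovers ~4.50%
```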

Relevance:

80.00%

Publisher:

Abstract:

This thesis studies the static and seismic behavior of simple structures made with gabion box walls. The analysis considers a one-story building with standard dimensions in plan (6 m x 5 m) and a lightweight timber roof. The main focus of the present investigation is to identify the principal aspects of the seismic behavior of a one-story building made with gabion box walls, in order to prevent failure due to seismic actions and in this way help reduce the seismic risk in developing countries where such natural disasters have significant intensity. Regarding the gabion box wall, calculations and analyses were performed to understand its static and dynamic behavior. From the static point of view, the normal stress arriving at the base of the gabion wall was verified against the corresponding bearing capacity of the ground. Regarding the seismic analysis, the in-plane and out-of-plane behavior was studied. The most critical aspect was found to be the out-of-plane behavior, for which models were developed considering the rigid no-tension model for masonry, finding a kinematically admissible multiplier that activates a collapse mechanism for the structure. Furthermore, FEM and DEM models were developed to find the maximum displacement at the center of the wall, the maximum tensile stresses needed for sizing the steel connectors that join consecutive gabions, and the dimensions (length of the wall and distance between orthogonal walls or buttresses) of a geometric configuration for the standard module of the structure, in order to ensure an adequate safety margin for earthquakes with a PGA around 0.4-0.5g. Using the results obtained, some rules of thumb were established that must be satisfied in order to ensure good behavior of these structures.
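
As a back-of-the-envelope illustration of the out-of-plane check described above: for a rigid, no-tension rectangular wall overturning about its base toe, moment balance gives the kinematically admissible horizontal multiplier α0 = (W·t/2)/(W·h/2) = t/h, which can be compared against the design PGA. The wall dimensions below are assumed for illustration, not taken from the thesis.

```python
# Rigid-body out-of-plane overturning check for a rectangular no-tension wall:
# stabilizing moment W*t/2 vs. overturning moment alpha*W*h/2  ->  alpha0 = t/h.
def overturning_multiplier(thickness_m, height_m):
    return thickness_m / height_m

t, h = 1.0, 2.5              # assumed one-gabion-thick wall geometry (m)
alpha0 = overturning_multiplier(t, h)
for pga_g in (0.4, 0.5):     # crude acceptance test: alpha0 >= PGA/g
    ok = alpha0 >= pga_g
    print(f"PGA {pga_g:g} g: alpha0 = {alpha0:.2f} -> "
          f"{'OK' if ok else 'mechanism activates'}")
```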

Relevance:

80.00%

Publisher:

Abstract:

We consider the suppression of spatiotemporal chaos in the complex Ginzburg-Landau equation by a combined global and local time-delay feedback. Feedback terms are implemented as a control scheme, i.e., they are proportional to the difference between the time-delayed state of the system and its current state. We perform a linear stability analysis of uniform oscillations with respect to space-dependent perturbations and compare with numerical simulations. Similarly, for the fixed-point solution that corresponds to amplitude death in the spatially extended system, a linear stability analysis with respect to space-dependent perturbations is performed and complemented by numerical simulations.
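
A minimal 1-D numerical sketch of this setup: the complex Ginzburg-Landau equation with a feedback term proportional to the difference between delayed and current states, mixing a global (spatial-mean) and a local contribution. The coefficients c1, c2 and the control parameters K, mu, tau below are illustrative assumptions, not the paper's values, and whether chaos is actually suppressed depends on them.

```python
# 1-D complex Ginzburg-Landau equation with combined global/local time-delay
# feedback  F = K*[ mu*(<A>(t-tau) - <A>(t)) + (1-mu)*(A(t-tau) - A(t)) ],
# integrated with explicit Euler and a spectral Laplacian.  Parameters are
# illustrative assumptions, not taken from the paper.
import numpy as np

N, L = 128, 100.0
c1, c2 = 2.0, -1.2                  # 1 + c1*c2 < 0: Benjamin-Feir unstable regime
K, mu, tau, dt = 0.5, 0.5, 1.0, 0.01
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
lap = -(k ** 2)                     # Fourier symbol of the Laplacian

rng = np.random.default_rng(3)
A = 1.0 + 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
delay_steps = int(round(tau / dt))
history = [A.copy() for _ in range(delay_steps)]   # ring buffer of past states

for step in range(20000):
    A_tau = history[step % delay_steps]            # state from time t - tau
    F = K * (mu * (A_tau.mean() - A.mean()) + (1 - mu) * (A_tau - A))
    lapA = np.fft.ifft(lap * np.fft.fft(A))
    A = A + dt * (A + (1 + 1j * c1) * lapA
                  - (1 + 1j * c2) * np.abs(A) ** 2 * A + F)
    history[step % delay_steps] = A.copy()

# near 1 with small spatial variance if the feedback stabilizes uniform oscillations
print("final mean |A| =", float(np.abs(A).mean()))
```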

Relevance:

80.00%

Publisher:

Abstract:

While it is well known that exposure to radiation can result in cataract formation, questions still remain about the presence of a dose threshold in radiation cataractogenesis. Since the exposure history from diagnostic CT exams is well documented in a patient’s medical record, the population of patients chronically exposed to radiation from head CT exams may be an interesting area to explore for further research in this area. However, there are some challenges in estimating lens dose from head CT exams. An accurate lens dosimetry model would have to account for differences in imaging protocols, differences in head size, and the use of any dose reduction methods.

The overall objective of this dissertation was to develop a comprehensive method to estimate radiation dose to the lens of the eye for patients receiving CT scans of the head. This research comprises a physics component, in which a lens dosimetry model was derived for head CT, and a clinical component, in which that dosimetry model was applied to patient data.

The physics component includes experiments related to the physical measurement of the radiation dose to the lens by various types of dosimeters placed within anthropomorphic phantoms. These dosimeters include high-sensitivity MOSFETs, TLDs, and radiochromic film. The six anthropomorphic phantoms used in these experiments range in age from newborn to adult.

First, the lens dose from five clinically relevant head CT protocols was measured in the anthropomorphic phantoms with MOSFET dosimeters on two state-of-the-art CT scanners. The volume CT dose index (CTDIvol), which is a standard CT output index, was compared to the measured lens doses. Phantom age-specific CTDIvol-to-lens dose conversion factors were derived using linear regression analysis. Since head size can vary among individuals of the same age, a method was derived to estimate the CTDIvol-to-lens dose conversion factor using the effective head diameter. These conversion factors were derived for each scanner individually, but also were derived with the combined data from the two scanners as a means to investigate the feasibility of a scanner-independent method. Using the scanner-independent method to derive the CTDIvol-to-lens dose conversion factor from the effective head diameter, most of the fitted lens dose values fell within 10-15% of the measured values from the phantom study, suggesting that this is a fairly accurate method of estimating lens dose from the CTDIvol with knowledge of the patient’s head size.
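
A hedged sketch of that size-specific estimation: regress the measured lens-dose-to-CTDIvol ratio on effective head diameter, then use the fit to predict lens dose for a new head size. The phantom numbers below are fabricated stand-ins, not the measured conversion factors from the study.

```python
# Size-specific CTDIvol-to-lens-dose conversion: linear fit of the conversion
# factor vs. effective head diameter.  All numbers are invented placeholders.
import numpy as np

diam_cm = np.array([12.0, 14.5, 16.0, 17.5, 18.5, 19.5])   # six phantoms (assumed)
conv    = np.array([1.25, 1.15, 1.08, 1.00, 0.96, 0.92])   # lens dose / CTDIvol

slope, intercept = np.polyfit(diam_cm, conv, 1)

def lens_dose_mGy(ctdi_vol_mGy, head_diameter_cm):
    """Predicted lens dose from scanner-reported CTDIvol and head size."""
    return ctdi_vol_mGy * (slope * head_diameter_cm + intercept)

print(f"estimated lens dose: {lens_dose_mGy(45.0, 15.0):.1f} mGy")
```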

Second, the dose reduction potential of organ-based tube current modulation (OB-TCM) and its effect on the CTDIvol-to-lens dose estimation method was investigated. The lens dose was measured with MOSFET dosimeters placed within the same six anthropomorphic phantoms. The phantoms were scanned with the five clinical head CT protocols with OB-TCM enabled on the one scanner model at our institution equipped with this software. The average decrease in lens dose with OB-TCM ranged from 13.5 to 26.0%. Using the size-specific method to derive the CTDIvol-to-lens dose conversion factor from the effective head diameter for protocols with OB-TCM, the majority of the fitted lens dose values fell within 15-18% of the measured values from the phantom study.

Third, the effect of gantry angulation on lens dose was investigated by measuring the lens dose with TLDs placed within the six anthropomorphic phantoms. The 2-dimensional spatial distribution of dose within the areas of the phantoms containing the orbit was measured with radiochromic film. A method was derived to determine the CTDIvol-to-lens dose conversion factor based upon distance from the primary beam scan range to the lens. The average dose to the lens region decreased substantially for almost all the phantoms (ranging from 67 to 92%) when the orbit was exposed to scattered radiation compared to the primary beam. The effectiveness of this method to reduce lens dose is highly dependent upon the shape and size of the head, which influences whether or not the angled scan range coverage can include the entire brain volume and still avoid the orbit.

The clinical component of this dissertation involved performing retrospective patient studies in the pediatric and adult populations, and reconstructing the lens doses from head CT examinations with the methods derived in the physics component. The cumulative lens doses in the patients selected for the retrospective study ranged from 40 to 1020 mGy in the pediatric group, and 53 to 2900 mGy in the adult group.

This dissertation represents a comprehensive approach to lens of the eye dosimetry in CT imaging of the head. The collected data and derived formulas can be used in future studies on radiation-induced cataracts from repeated CT imaging of the head. Additionally, it can be used in the areas of personalized patient dose management, and protocol optimization and clinician training.

Relevance:

80.00%

Publisher:

Abstract:

A certain type of bacterial inclusion, known as a bacterial microcompartment, was recently identified and imaged through cryo-electron tomography. A 3D object reconstructed from single-axis, limited-angle tilt-series cryo-electron tomography contains missing regions; this is known as the missing wedge problem. Due to the missing regions in the reconstructed images, analyzing their 3D structures is challenging. Existing methods overcome this problem by aligning and averaging several similarly shaped objects. These schemes work well if the objects are symmetric and several objects with nearly similar shapes and sizes are available. Since the bacterial inclusions studied here are not symmetric, are deformed, and show a wide range of shapes and sizes, the existing approaches are not appropriate. This research develops new statistical methods for analyzing geometric properties, such as volume, symmetry, aspect ratio, and polyhedral structure, of these bacterial inclusions in the presence of missing data. These methods work with deformed, non-symmetric objects of varied shape and do not require multiple objects to handle the missing wedge problem. The developed methods and contributions include: (a) an improved method for manual image segmentation, (b) a new approach to 'complete' the segmented and reconstructed incomplete 3D images, (c) a polyhedral structural distance model to predict the polyhedral shapes of these microstructures, (d) a new shape descriptor for polyhedral shapes, named the polyhedron profile statistic, and (e) Bayes, linear discriminant analysis, and support vector machine classifiers for supervised classification of incomplete polyhedral shapes. Finally, the predicted 3D shapes of these bacterial microstructures belong to the Johnson solids family, and these shapes, along with their other geometric properties, are important for a better understanding of their chemical and biological characteristics.
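
As an illustration of contribution (e), the sketch below cross-validates an LDA and an SVM classifier on shape-descriptor vectors. The features are random placeholders standing in for the polyhedron profile statistic, so only the classification workflow, not the descriptor itself, is shown.

```python
# Supervised shape classification with LDA and an SVM, as named in (e).
# Descriptor vectors here are synthetic placeholders, not the thesis' features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_per_class, dim = 40, 16
# two synthetic "polyhedral families" separated by a mean shift
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, dim)),
               rng.normal(1.0, 1.0, (n_per_class, dim))])
y = np.repeat([0, 1], n_per_class)

for clf in (LinearDiscriminantAnalysis(), SVC(kernel="rbf")):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(type(clf).__name__, f"CV accuracy = {acc:.2f}")
```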

Relevance:

80.00%

Publisher:

Abstract:

The design and analysis of conceptually different cooling systems for human heart preservation are numerically investigated. A heart cooling container with the required connections was designed for a normal-size human heart. A three-dimensional, high-resolution human heart geometric model obtained from CT-angio data was used for the simulations. Nine different cooling designs are introduced in this research. The first design (Case 1) used a cooling gelatin only outside of the heart. In the second design (Case 2), the internal parts of the heart were cooled by pumping a cooling liquid through both the heart's pulmonary and systemic circulation systems. An unsteady conjugate heat transfer analysis was performed to simulate the temperature field variations within the heart during the cooling process. Case 3 simulated the currently used cooling method, in which the coolant is stagnant. Case 4 was a combination of Case 1 and Case 2. A linear thermoelasticity analysis was performed to assess the stresses applied to the heart during the cooling process. In Cases 5 through 9, the coolant solution was used for both internal and external cooling. For external circulation in Case 5 and Case 6, two inlets and two outlets were designed on the walls of the cooling container. Case 5 used laminar flows for the coolant circulations inside and outside of the heart. Effects of turbulent flow on cooling of the heart were studied in Case 6. In Case 7, an additional inlet was designed on the cooling container wall to create a jet impinging on the hot region of the heart's wall. Unsteady periodic inlet velocities were applied in Case 8 and Case 9. The average temperature of the heart in Case 5 was +5.0°C after 1500 s of cooling. Multi-objective constrained optimization was performed for Case 5. The inlet velocities for the two internal and one external coolant circulations were the three design variables, and minimizing the average temperature of the heart, the wall shear stress, and the total volumetric flow rate were the three objectives. The only constraint was to keep the von Mises stress below the ultimate tensile stress of the heart's tissue.
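
A schematic of that optimization, assuming cheap surrogate functions in place of the CFD and thermoelasticity solvers: three inlet velocities as design variables, a weighted sum of the three objectives, and an inequality constraint keeping the von Mises stress below the ultimate tensile stress. All functional forms, weights, and numbers below are invented for illustration.

```python
# Weighted-sum multi-objective optimization with a von Mises stress constraint.
# Surrogate objective/stress functions stand in for the CFD/FEA solvers.
import numpy as np
from scipy.optimize import minimize

def objectives(v):                       # v = (v_int1, v_int2, v_ext), m/s
    temp  = 12.0 / (1.0 + v.sum())       # mean heart temperature proxy, degC
    shear = 0.8 * np.sum(v ** 2)         # wall-shear proxy, Pa
    flow  = v.sum()                      # total volumetric-flow proxy
    return temp, shear, flow

def weighted_sum(v, w=(0.6, 0.2, 0.2)):  # assumed objective weights
    return float(np.dot(w, objectives(np.asarray(v))))

def von_mises_margin(v):                 # must stay >= 0 (stress < ultimate)
    sigma_ult = 110.0                                        # kPa, assumed
    sigma = 30.0 * np.sum(np.asarray(v) ** 2)                # stress proxy
    return sigma_ult - sigma

res = minimize(weighted_sum, x0=[0.5, 0.5, 0.5],
               bounds=[(0.05, 1.5)] * 3,
               constraints=[{"type": "ineq", "fun": von_mises_margin}])
print("optimal inlet velocities:", np.round(res.x, 3))
print("objectives (T, shear, flow):", np.round(objectives(res.x), 3))
```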