Abstract:
Early intervention and intensive therapy improve the outcome of neuromuscular rehabilitation. There are indications that recovery is more effective when the patient is motivated and consciously plans the movement. Therefore, a strategy for patient-cooperative control of rehabilitation devices for the upper extremities is proposed and evaluated. The strategy is based on the minimal intervention principle, allowing efficient exploitation of task-space redundancies and resulting in user-driven movement trajectories. The patient's effort is taken into account by enabling the machine to comply with forces exerted by the user. The interaction is enhanced through a multimodal display and a virtually generated environment that includes haptic, visual and sound modalities.
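A common way to realize this kind of force compliance is admittance control, in which measured interaction forces are mapped to reference motion through a virtual mass-damper. The sketch below is illustrative only; the 1-DoF setting and the gain values are assumptions, not the controller described in the abstract.

```python
import numpy as np

def admittance_step(f_user, v, dt, m_virt=2.0, b_virt=8.0):
    """One step of a 1-DoF admittance law: m_virt*dv/dt + b_virt*v = f_user.

    f_user : measured interaction force [N]
    v      : current reference velocity [m/s]
    Returns the updated reference velocity fed to the position controller.
    """
    dv = (f_user - b_virt * v) / m_virt
    return v + dv * dt

# Illustrative run: a constant 5 N push converges to v = f/b = 0.625 m/s.
v, dt = 0.0, 0.01
for _ in range(500):
    v = admittance_step(5.0, v, dt)
print(f"steady-state velocity = {v:.3f} m/s")
```

The virtual damping sets how much the device yields to the patient's effort, which is the mechanism by which user forces shape the movement trajectory.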
Abstract:
This study presents an integrated mineralogical-geochemical database on fine-grained sediments transported by all major rivers of southern Africa, including the Zambezi, Okavango, Limpopo, Olifants, Orange and Kunene. Clay mineralogy, bulk geochemistry, and Sr and Nd isotopic signatures of river mud, considered a proxy for suspended load, are used to investigate the influence of source-rock lithology and weathering intensity on the composition of clay and silt produced at subequatorial to subtropical latitudes. Depletion in mobile alkali and alkaline-earth metals, minor in arid Namibia, is strong in the Okavango, Kwando and Upper Zambezi catchments, where recycling is also extensive. Element removal is most significant for Na and, to a lesser extent, Sr. Depletion in K, Ca and other elements, negligible in Namibia, is moderate elsewhere. The most widespread clay minerals are smectite, dominant in muds derived from Karoo or Etendeka flood basalts, and illite and chlorite, dominant in muds derived from metasedimentary rocks of the Damara Orogen or Zimbabwe Craton. Kaolinite represents 30-40% of clay minerals only in Okavango and Upper Zambezi sediments sourced in humid subequatorial Angola and Zambia. After subtracting the effects of recycling and of local accumulation of authigenic carbonates in soils, the regional distribution of clay minerals and chemical indices consistently reflects weathering intensity, primarily controlled by climate. Bulk geochemistry most clearly identifies volcaniclastic sediments and mafic sources in general, but cannot discriminate the other sources of detritus in detail. In contrast, Sr and Nd isotopic fingerprints are insensitive to weathering and thus faithfully mirror the tectonic structure of the southern African continent. Isotopic tools therefore represent a much firmer basis than bulk geochemistry or clay mineralogy for provenance studies of mudrocks.
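The "chemical indices" referred to here are typically weathering indices such as the Chemical Index of Alteration (CIA) of Nesbitt and Young (1982), computed from molar oxide proportions; whether this particular index is the one used in the study is an assumption. A minimal sketch:

```python
def cia(al2o3, cao_sil, na2o, k2o):
    """Chemical Index of Alteration (Nesbitt & Young, 1982).

    Inputs are molar proportions of the oxides; cao_sil is CaO in the
    silicate fraction only (carbonate/apatite Ca must be subtracted first,
    which is why the abstract corrects for authigenic carbonates in soils).
    CIA is ~50 for fresh rock and approaches 100 for intensely weathered,
    kaolinite-rich material.
    """
    return 100.0 * al2o3 / (al2o3 + cao_sil + na2o + k2o)

# Example with invented molar values for an intensely weathered mud:
print(round(cia(0.20, 0.01, 0.02, 0.03), 1))  # ~76.9
```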
Abstract:
Is numerical mimicry a third way of establishing truth? Kevin Heng received his M.S. and Ph.D. in astrophysics from the Joint Institute for Laboratory Astrophysics (JILA) and the University of Colorado at Boulder. He was at the Institute for Advanced Study in Princeton from 2007 to 2010, first as a Member and later as the Frank & Peggy Taplin Member. From 2010 to 2012 he was a Zwicky Prize Fellow at ETH Zürich (the Swiss Federal Institute of Technology). In 2013, he joined the Center for Space and Habitability (CSH) at the University of Bern, Switzerland, as a tenure-track assistant professor, where he leads the Exoplanets and Exoclimes Group. He has worked on, and maintains, a broad range of interests in astrophysics: shocks, extrasolar asteroid belts, planet formation, fluid dynamics, brown dwarfs and exoplanets. He coordinates the Exoclimes Simulation Platform (ESP), an open-source set of theoretical tools designed for studying the basic physics and chemistry of exoplanetary atmospheres and climates (www.exoclime.org). He is involved in the CHEOPS (Characterizing Exoplanet Satellite) space telescope, a mission approved by the European Space Agency (ESA) and led by Switzerland. He spends a fair amount of time humbly learning the lessons gleaned from studying the Earth and Solar System planets, as related to him by atmospheric, climate and planetary scientists. He received a Sigma Xi Grant-in-Aid of Research in 2006.
Abstract:
Aging societies suffer from an increasing incidence of bone fractures. Bone strength depends on the amount of mineral measured by clinical densitometry, but also on the micromechanical properties of the bone's hierarchical organization. A good understanding has been reached for elastic properties on several length scales, but reliable postyield data on the lower length scales are still lacking. To describe the behavior of bone at the microscale, an anisotropic elastic-viscoplastic damage model was developed using an eccentric generalized Hill criterion and nonlinear isotropic hardening. The model was implemented as a user subroutine in Abaqus and verified using single-element tests. An FE simulation of microindentation in lamellar bone was then performed, showing that the new constitutive model can capture the main characteristics of the indentation response of bone.

As the generalized Hill criterion is limited to elliptical and cylindrical yield surfaces and the correct shape for bone is not known, a new yield surface was developed that takes any convex quadratic shape. Its main advantage is that, for material identification, the shape of the yield surface does not have to be anticipated: a minimization yields the optimal shape among all convex quadrics. The generality of the formulation was demonstrated by showing its degeneration to classical yield surfaces, and existing yield criteria for bone at multiple length scales were converted to the quadric formulation. A computational study was then performed to determine the influence of yield surface shape and damage on the indentation response of bone using spherical and conical tips. The constitutive model was adapted to the quadric criterion, and yield surface shape and critical damage were varied; both were shown to have a major impact on the indentation curves. Their influence on indentation modulus, hardness, their ratio, and the elastic-to-total work ratio was very well described by multilinear regressions for both tip shapes. For conical tips, indentation depth was not a significant factor, while for spherical tips damage was insignificant.

All inverse methods based on microindentation suffer from non-uniqueness of the identified material properties in the case of nonlinear material behavior. Therefore, monotonic and cyclic micropillar compression tests in a scanning electron microscope, which allow a straightforward interpretation, complemented by microindentation and macroscopic uniaxial compression tests, were performed on dry ovine bone to identify modulus, yield stress, plastic deformation, damage accumulation and failure mechanisms. While the elastic properties were highly consistent, the postyield deformation and failure mechanisms differed between the two length scales. A majority of the micropillars showed ductile behavior, with strain hardening until failure by localization in a slip plane, while the macroscopic samples failed in a quasi-brittle fashion, with microcracks coalescing into macroscopic failure surfaces. In agreement with a proposed rheological model, these experiments illustrate a transition from a ductile mechanical behavior of bone at the microscale to a quasi-brittle response driven by the growth of preexisting cracks along interfaces or in the vicinity of pores at the macroscale.

Subsequently, a study was undertaken to quantify the topological variability of indentations in bone and examine its relationship with mechanical properties. Indentations were performed in dry human and ovine bone in axial and transverse directions, and their topography was measured by AFM. Statistical shape modeling of the residual imprint allowed a mean shape to be defined and the variability to be described with 21 principal components related to imprint depth, surface curvature and roughness. The indentation profile of bone was highly consistent and free of any pile-up. A few of the topological parameters, in particular depth, showed significant correlations with variations in mechanical properties, but the correlations were not very strong or consistent. We could thus verify that bone is rather homogeneous in its micromechanical properties and that indentation results are not strongly influenced by small deviations from the ideal case.

As the uniaxial properties measured by micropillar compression are in conflict with the current literature on bone indentation, another dissipative mechanism has to be present. The elastic-viscoplastic damage model was therefore extended to viscoelasticity. The viscoelastic properties were identified from macroscopic experiments, while the quasistatic postelastic properties were extracted from micropillar data. It was found that viscoelasticity governed by macroscale properties has very little influence on the indentation curve and results in a clear underestimation of the creep deformation. Adding viscoplasticity leads to increased creep, but hardness is still highly overestimated. A reasonable fit with experimental indentation curves for both Berkovich and spherical indentation was obtained when abandoning the assumption that shear strength is governed by an isotropy condition. These results remain to be verified by independent tests probing the micromechanical strength properties in tension and shear.

In conclusion, in this thesis several tools were developed to describe the complex behavior of bone on the microscale, and experiments were performed to identify its material properties. Micropillar compression highlighted a size effect in bone due to the presence of preexisting cracks and of pores or interfaces like cement lines. A reasonable fit was obtained between experimental indentation curves using different tips and simulations using the constitutive model with uniaxial properties measured by micropillar compression. Additional experimental work is necessary to identify the exact nature of the size effect and the mechanical role of interfaces in bone. Deciphering the micromechanical behavior of lamellar bone, its evolution with age, disease and treatment, and its failure mechanisms on several length scales will help prevent fractures in the elderly in the future.
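A yield surface covering "any convex quadratic shape" can be written in Voigt notation as f(σ) = σᵀFσ + fᵀσ − 1, with convexity guaranteed when the quadratic-term matrix F is positive semidefinite. The sketch below (the function names and the von Mises special case are illustrative assumptions, not the thesis implementation) evaluates such a criterion and checks convexity:

```python
import numpy as np

def quadric_yield(sigma, F, f):
    """Generalized quadric criterion f(s) = s^T F s + f^T s - 1.

    sigma is a 6x1 Voigt stress vector; yielding occurs when the value
    reaches 0. The linear term f allows an eccentric (tension/compression
    asymmetric) surface, as in the eccentric generalized Hill criterion.
    """
    sigma = np.asarray(sigma, float)
    return sigma @ F @ sigma + f @ sigma - 1.0

def is_convex(F, tol=1e-10):
    """The quadric level set is convex iff F is positive semidefinite."""
    return bool(np.all(np.linalg.eigvalsh(F) >= -tol))

# Degeneration to von Mises (yield stress s_y) as a sanity check:
s_y = 100.0  # MPa, assumed
H = 1.0 / (2 * s_y**2)
F = np.array([[ 2*H,  -H,  -H,   0,   0,   0],
              [ -H,  2*H,  -H,   0,   0,   0],
              [ -H,   -H, 2*H,   0,   0,   0],
              [  0,    0,   0, 6*H,   0,   0],
              [  0,    0,   0,   0, 6*H,   0],
              [  0,    0,   0,   0,   0, 6*H]])
f = np.zeros(6)
print(is_convex(F))                                          # True
print(round(quadric_yield([s_y, 0, 0, 0, 0, 0], F, f), 6))   # 0.0 -> on the surface
```

In a material-identification setting, the F entries (and the eccentricity vector f) become the optimization variables, with the PSD constraint keeping every candidate surface convex.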
Abstract:
Dicto is a declarative language for specifying architectural rules using a single uniform notation. Once defined, rules can automatically be validated using adapted off-the-shelf tools.
Abstract:
Foot-and-mouth disease (FMD) is a highly contagious disease that caused several large outbreaks in Europe in the last century. The last important outbreak in Switzerland took place in 1965/66, affected more than 900 premises, and led to the slaughter of more than 50,000 animals. Large-scale emergency vaccination of the cattle and pig population was applied to control the epidemic. In recent years, many studies have used infectious disease models to assess the impact of different disease control measures, including models developed for diseases exotic to the specific region of interest. Often, the absence of real outbreak data makes a validation of such models impossible. This study aimed to evaluate whether a spatial, stochastic simulation model (the Davis Animal Disease Simulation model) could predict the course of a Swiss FMD epidemic based on the available historic input data on population structure, contact rates, epidemiology of the virus, and quality of the vaccine. In addition, the potential outcome of the 1965/66 FMD epidemic without vaccination was investigated. Comparing the model outcomes to reality, only the largest 10% of the simulated outbreaks approximated the number of animals culled, and the model highly overestimated the number of culled premises. While the outbreak duration of the 1965/66 epidemic could not be well reproduced, the model accurately estimated the size of the infected area. Without vaccination, the model predicted a much higher mean number of culled animals than with vaccination, demonstrating that vaccination was likely crucial in controlling the Swiss FMD outbreak of 1965/66. The study demonstrated the feasibility of analyzing historical outbreak data with modern analytical tools. However, it also confirmed that epidemics predicted by even the most carefully parameterized model cannot integrate all eventualities of a real epidemic. Decision makers therefore need to be aware that infectious disease models are useful tools to support the decision-making process, but their results are not as valuable as real observations and should always be interpreted with caution.
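The Davis Animal Disease Simulation model itself is not reproduced here; the sketch below shows only the generic core shared by spatial, stochastic premises-level models: each day, every susceptible premises becomes infected with a probability driven by a distance-dependent transmission kernel. The kernel shape and all parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

def daily_infection_step(coords, infected, beta=0.05, r0=3.0):
    """One day of a minimal spatial stochastic epidemic on premises.

    coords   : (n, 2) premises locations [km]
    infected : boolean mask of currently infectious premises
    beta     : per-pair daily transmission scale (assumed)
    r0       : kernel range parameter [km] (assumed)
    """
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    kernel = beta / (1.0 + (dists / r0) ** 2)   # distance-dependent hazard
    # Probability each premises escapes infection from every infectious one:
    p_escape = np.prod(1.0 - kernel[:, infected], axis=1)
    new_inf = (~infected) & (rng.random(len(coords)) > p_escape)
    return infected | new_inf

# Toy run: 200 premises in a 50x50 km area, one index case, 30 days.
coords = rng.uniform(0, 50, size=(200, 2))
status = np.zeros(200, dtype=bool)
status[0] = True
for day in range(30):
    status = daily_infection_step(coords, status)
print(f"infected premises after 30 days: {status.sum()}")
```

Running many such stochastic realizations and reporting their distribution is what produces statements like "only the largest 10% of the simulated outbreaks approximated the number of animals culled."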
Abstract:
Strategic control is defined as the use of qualitative and quantitative tools for the evaluation of strategic organizational performance. Most research in strategic planning has focused on strategy formulation and implementation; little work has been done on strategic performance evaluation, particularly in the area of cancer research. The objective of this study was to identify the strategic control approaches and financial performance metrics used by major cancer centers in the country, as an initial step in expanding the theory and practice behind strategic organizational performance. Focusing on hospitals that share a similar mandate and resource constraints was expected to improve measurement precision. The results indicate that most cancer centers use a wide selection of evaluation tools, but sophisticated analytical approaches were less common. In addition, there was evidence that high-performing centers tend to invest more resources in strategic performance analysis than centers showing lower financial results. The conclusions point to the need to incorporate a higher degree of analytical power in order to improve the tracking of strategic performance. This study is one of the first to concentrate on the area of strategic control.
Abstract:
Cryoablation for small renal tumors has demonstrated sufficient clinical efficacy over the past decade as a non-surgical nephron-sparing approach for treating renal masses in patients who are not surgical candidates. Minimally invasive percutaneous cryoablations have been performed with image guidance from CT, ultrasound, and MRI. During the MRI-guided cryoablation procedure, the interventional radiologist visually compares the iceball size on monitoring images with the original tumor on separate planning images. The comparisons made during the monitoring step are time-consuming, inefficient, and sometimes lack the precision needed for decision making, requiring the radiologist to make further changes later in the procedure. This study sought to mitigate uncertainty in these visual comparisons by quantifying tissue response to cryoablation and providing visualization of the response during the procedure. Based on retrospective analysis of MR-guided cryoablation patient data, registration and segmentation algorithms were investigated and implemented for periprocedural visualization, delivering iceball position and size with respect to planning images registered to within 3.3 mm with at least 70% overlap. A quantitative logit model was developed relating perfusion deficit in renal parenchyma, visualized in verification images, to iceball extent visualized in monitoring images. Through a retrospective study of 20 patient cases, the relationship between the likelihood of perfusion loss in renal parenchyma and the distance within the iceball was quantified and iteratively fit to a logit curve. Using the parameters from the logit fit, the margin for 95% perfusion-loss likelihood was found to be 4.28 mm within the iceball, which corresponds well with the clinically accepted margin of 3-5 mm. To display the iceball position and perfusion-loss likelihood to the radiologist, algorithms were implemented in a fast segmentation and registration module that executed in under 2 minutes, within the clinically relevant 3-minute monitoring period. In 16 patient cases, registration reduced the average Hausdorff distance from 10.1 mm to 3.21 mm and increased the average DSC from 46.6% to 82.6%.
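The logit relationship between perfusion-loss likelihood and depth inside the iceball can be fit with standard logistic regression and then inverted to recover a likelihood margin. The sketch below uses synthetic data and scikit-learn in place of whatever fitting procedure the study actually used; all coefficients are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic data (assumed): signed distance inside the iceball [mm]
# (positive = deeper inside) and observed perfusion loss (1) or not (0).
rng = np.random.default_rng(0)
depth = rng.uniform(-5, 15, 500)
p_true = 1 / (1 + np.exp(-(0.9 * depth - 2.0)))   # assumed ground truth
loss = (rng.random(500) < p_true).astype(int)

model = LogisticRegression().fit(depth.reshape(-1, 1), loss)
b0, b1 = model.intercept_[0], model.coef_[0, 0]

# Invert the logit to find the depth where P(perfusion loss) = 0.95:
p = 0.95
margin = (np.log(p / (1 - p)) - b0) / b1
print(f"95% perfusion-loss margin = {margin:.2f} mm inside the iceball")
```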
Abstract:
This paper reflects on the forms that control of the public administration takes in our country, with particular emphasis on social control. A review and analysis of various norms and programs, cross-referenced with the different modalities of control, reveals various instances of participation that constitute new fields of citizen intervention. In each and every one of them, access to public information appears as a genuine precondition for participation. These new instruments and modalities are marked, however, by the non-binding character of public hearings and the narrow margin left for active participation in the various programs. Even so, these instruments can be improved, and future work should consider how to transform this incipient participation into an active, vigorous intervention that defines a new State/society relationship.
Abstract:
Once the library was accepted as part of the cycle of knowledge creation, organization and dissemination, its concept shifted from that of a closed entity to a dynamic system in constant interaction with its environment. It came to be recognized as a social institution rather than a mere collection of documents, and as an entity to which management principles could be applied. Since then, various management tools have been used for decision-making in libraries. Among these tools, control charts are of great importance in statistical process control; they are used to measure the stability of a process over time. They have been widely applied in statistical quality control, beginning in industry, and their use has since spread to a wide range of disciplines, including service companies and administrative units. Control charts are presented here as an important management tool applied to library technical processes, enabling their evaluation and the monitoring of their performance through indicators and other diagnostic data.
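As an illustration of the technique, a c-chart for counted defects sets control limits at the mean count plus or minus three standard deviations (±3·√c̄ for Poisson counts). The framing below, cataloging errors per weekly batch of records, and all data values are assumptions.

```python
import numpy as np

# Assumed data: cataloging errors found in 12 weekly batches of records.
errors = np.array([4, 6, 3, 5, 7, 4, 5, 6, 2, 5, 14, 4])

c_bar = errors.mean()                      # center line
ucl = c_bar + 3 * np.sqrt(c_bar)           # upper control limit (Poisson)
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0)   # lower limit, floored at zero

print(f"center line = {c_bar:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
for week, c in enumerate(errors, 1):
    if not lcl <= c <= ucl:
        print(f"week {week}: {c} errors -> out of control, investigate")
```

Points outside the limits signal special-cause variation in the technical process (here, week 11), which is exactly the diagnostic use of control charts described above.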
Abstract:
The importance of renewable energies for the European electricity market is growing rapidly. This presents transmission grids and the power market in general with new challenges stemming from the higher spatiotemporal variability of power generation: because renewable power production results from weather phenomena, it is difficult to plan and control. We present a sensitivity study of a total solar eclipse in central Europe in March. The weather in Germany and Europe was modeled using the German Weather Service's local area models COSMO-DE and COSMO-EU, respectively (http://www.cosmo-model.org/). The simulations were performed with and without considering a solar eclipse for the following three situations:
1. An idealized, clear-sky situation for the entire model area (Europe, COSMO-EU)
2. A real weather situation with mostly cloudy skies (Germany, COSMO-DE)
3. A real weather situation with mostly clear skies (Germany, COSMO-DE)
The data should help to evaluate the effects of a total solar eclipse on the weather in the planetary boundary layer. The results show that a total solar eclipse has significant effects, particularly on the main variables for renewable energy production such as solar irradiation and temperature near the ground.
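A first-order way to impose an eclipse on a radiation scheme is to scale incoming solar irradiance by (1 − obscuration(t)), where the obscuration fraction follows the eclipse geometry. The piecewise-linear time profile below is a simplifying assumption for illustration, not the COSMO implementation (which would account for the actual eclipse geometry and limb darkening).

```python
import numpy as np

def eclipse_obscuration(t, t_start, t_max, t_end, max_obscuration=0.75):
    """Assumed piecewise-linear obscuration profile (fraction of the solar
    disk covered), rising from first contact to maximum and back to last
    contact; zero outside the eclipse window."""
    t = np.asarray(t, float)
    rise = np.clip((t - t_start) / (t_max - t_start), 0, 1)
    fall = np.clip((t_end - t) / (t_end - t_max), 0, 1)
    return max_obscuration * np.minimum(rise, fall)

# Scale a clear-sky global irradiance series during the eclipse window:
t = np.linspace(9.0, 13.0, 9)                  # hours UTC
ghi_clear = 500.0 * np.ones_like(t)            # W/m^2, idealized
ghi = ghi_clear * (1 - eclipse_obscuration(t, 9.5, 10.75, 12.0))
print(np.round(ghi, 1))
```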
Abstract:
Most Cenozoic nannofossil and many foraminiferal zonal boundaries have been accurately determined and magnetostratigraphically calibrated at five Leg 73 boreholes. The numerical ages of the boundaries were computed by assuming a linear seafloor spreading rate and a radiometric age of 66.5 m.y. for the Cretaceous/Tertiary boundary. Alternative magnetostratigraphic ages (given below in parentheses) were obtained by adopting a 63.5 m.y. age for the Cenozoic. Our data confirm previous determinations of the Pleistocene/Pliocene boundary at 1.8 (1.7) m.y. and of the Pliocene/Miocene boundary at 5.1 (5.0) m.y. The Miocene/Oligocene boundary is placed within Chron C-6C and has a magnetostratigraphic age of 23.8 to 24.0 (22.7 to 22.9) m.y. The Oligocene/Eocene boundary is also very precisely located within Chron C-13-R, with a magnetostratigraphic age of 37.1 to 37.2 (35.5 to 35.6) m.y. The Eocene/Paleocene boundary should be located within an uncored interval of Chron C-24 and have a magnetostratigraphic age of 59.0 (55.4) +/- 0.2 m.y. The general accord of the magnetostratigraphic and radiometric ages supports the hypothesis that the seafloor spreading rate was linear during the Cenozoic. Two possible exceptions are noted: the middle Miocene radiometric ages are a few million years older, and the early Eocene radiometric ages are several million years younger, than the corresponding magnetostratigraphic ages.
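Under the linear-spreading assumption, a boundary's numerical age follows by linear interpolation of position (depth or distance along the magnetic anomaly profile) between two calibrated tie points, for example anchoring the Cretaceous/Tertiary boundary at 66.5 m.y. The tie-point positions below are invented for illustration only.

```python
def age_from_tiepoints(x, x1, age1, x2, age2):
    """Linear interpolation between two magnetostratigraphic tie points.

    x, x1, x2  : positions along the section (depth or distance)
    age1, age2 : calibrated ages [m.y.] at the tie points
    A linear seafloor spreading rate implies age varies linearly
    with position between the tie points.
    """
    return age1 + (x - x1) * (age2 - age1) / (x2 - x1)

# Assumed tie points: zero age at the section top (0 m) and the
# Cretaceous/Tertiary boundary (66.5 m.y.) at 240 m sub-bottom.
print(round(age_from_tiepoints(86.0, 0.0, 0.0, 240.0, 66.5), 1))  # ~23.8 m.y.
```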
Abstract:
Monitoring the impact of sea storms on coastal areas is fundamental to studying beach evolution and the vulnerability of low-lying coasts to erosion and flooding. Modelling wave runup on a beach is possible, but it requires accurate topographic data and model tuning, which can be done by comparing observed and modeled runup. In this study we collected aerial photos using an Unmanned Aerial Vehicle after two different swells over the same study area. We merged the point cloud obtained with photogrammetry with multibeam data to obtain a complete beach topography. Then, on each set of rectified and georeferenced UAV orthophotos, we identified the maximum wave runup for both events by recognizing the wet area left by the waves. We then used our topography and numerical models to simulate the wave runup and compared the model results to the values observed during the two events. Our results highlight the potential of the presented methodology, which integrates UAV platforms, photogrammetry and Geographic Information Systems to provide faster and cheaper information on beach topography and geomorphology than traditional techniques, without loss of accuracy. We use the results obtained from this technique as a topographic base for a model that calculates runup for the two swells. The observed and modeled runups are consistent, and open new directions for future research.
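Empirical runup formulations, such as the widely used Stockdon et al. (2006) parameterization of the 2% exceedance runup R2%, take deep-water wave height, wave period and foreshore beach slope as inputs; whether this particular formulation is the model used in the study is an assumption, and the storm parameters below are invented.

```python
import numpy as np

def stockdon_r2(h0, t0, beta):
    """2% exceedance wave runup, Stockdon et al. (2006).

    h0   : deep-water significant wave height [m]
    t0   : peak wave period [s]
    beta : foreshore beach slope [-]
    """
    l0 = 9.81 * t0**2 / (2 * np.pi)          # deep-water wavelength
    setup = 0.35 * beta * np.sqrt(h0 * l0)   # wave setup component
    swash = np.sqrt(h0 * l0 * (0.563 * beta**2 + 0.004)) / 2
    return 1.1 * (setup + swash)

# Example with assumed storm parameters: Hs = 3 m, Tp = 9 s, slope = 0.08.
print(f"R2% = {stockdon_r2(3.0, 9.0, 0.08):.2f} m")
```

The beach slope is exactly what the merged UAV photogrammetry and multibeam topography provides, and comparing R2% against the wet-area limits mapped on the orthophotos is the observed-versus-modeled check described in the abstract.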