971 results for Polímers -- Proves
Abstract:
This paper examines the accuracy of software-based on-line energy estimation techniques. It evaluates today's most widespread energy estimation model in order to investigate whether the current methodology of purely software-based energy estimation running on a sensor node itself can indeed reliably and accurately determine its energy consumption, independent of the particular node instance, the traffic load the node is exposed to, or the MAC protocol the node is running. The paper enhances today's widely used energy estimation model by integrating radio transceiver switches into the model, and proposes a methodology for finding the optimal estimation model parameters. It proves by statistical validation with experimental data that the proposed model enhancement and parameter calibration methodology significantly increase the estimation accuracy.
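The widespread model the paper refers to is a linear state-based one: energy is the supply voltage times the sum of per-state current draws weighted by time spent in each state. A minimal sketch of such a model, extended with a per-switch charge term of the kind the paper adds for radio transceiver switches, is below. All currents, switch charges, and counts are illustrative assumptions, not values from the paper.

```python
# Linear software-based energy estimation: E = V * (sum_i I_i*t_i + sum_j q_j*n_j).
# The second sum is the enhancement: a fixed charge cost per radio state switch.

STATE_CURRENT_MA = {      # assumed average current draw per component state (mA)
    "cpu_active": 1.8,
    "cpu_sleep": 0.005,
    "radio_tx": 17.4,
    "radio_rx": 19.7,
}
SWITCH_CHARGE_MC = {      # assumed extra charge per radio transceiver switch (mC)
    "radio_on": 0.02,
    "radio_off": 0.01,
}

def estimate_energy_mj(state_times_s, switch_counts, voltage=3.0):
    """Return estimated energy in mJ (mA*s = mC, and mC * V = mJ)."""
    charge_mc = sum(STATE_CURRENT_MA[s] * t for s, t in state_times_s.items())
    charge_mc += sum(SWITCH_CHARGE_MC[sw] * n for sw, n in switch_counts.items())
    return voltage * charge_mc

# One minute of hypothetical node activity with 120 radio on/off cycles:
e = estimate_energy_mj(
    {"cpu_active": 2.0, "cpu_sleep": 58.0, "radio_tx": 0.5, "radio_rx": 1.5},
    {"radio_on": 120, "radio_off": 120},
)
```

Calibration, in this framing, means fitting the per-state currents and per-switch charges to measured ground truth rather than taking datasheet values.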
Abstract:
The early detection of subjects with probable Alzheimer's disease (AD) is crucial for the effective application of treatment strategies. Here we explored the ability of a multitude of linear and non-linear classification algorithms to discriminate between the electroencephalograms (EEGs) of patients with varying degrees of AD and their age-matched control subjects. Absolute and relative spectral power, distribution of spectral power, and measures of spatial synchronization were calculated from recordings of resting eyes-closed continuous EEGs of 45 healthy controls, 116 patients with mild AD and 81 patients with moderate AD, recruited in two different centers (Stockholm, New York). The applied classification algorithms were: principal component linear discriminant analysis (PC LDA), partial least squares LDA (PLS LDA), principal component logistic regression (PC LR), partial least squares logistic regression (PLS LR), bagging, random forest, support vector machines (SVM) and feed-forward neural network. Based on 10-fold cross-validation runs, it could be demonstrated that even though modern computer-intensive classification algorithms such as random forests, SVM and neural networks show a slight superiority, more classical classification algorithms performed nearly equally well. Using random forest classification, a considerable sensitivity of up to 85% and a specificity of 78% were reached even for the test of only mild AD patients, whereas for the comparison of moderate AD vs. controls, using SVM and neural networks, values of 89% and 88% for sensitivity and specificity were achieved. Such a remarkable performance proves the value of these classification algorithms for clinical diagnostics.
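The evaluation scheme the abstract names, 10-fold cross-validation, can be sketched as follows. The data here are synthetic stand-ins for the EEG spectral-power features, and the classifier is a deliberately simple nearest-centroid rule rather than any of the paper's eight algorithms; only the fold structure is the point.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic two-class feature matrix: 50 "controls" and 50 "patients",
# 8 spectral features each, with a class mean shift of 1.0 per feature.
X = np.vstack([rng.normal(0.0, 1.0, (50, 8)),
               rng.normal(1.0, 1.0, (50, 8))])
y = np.array([0] * 50 + [1] * 50)

def kfold_accuracy(X, y, k=10):
    """Split samples into k folds; train on k-1 folds, score on the held-out one."""
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    accs = []
    for f in folds:
        train = np.setdiff1d(idx, f)
        # nearest-centroid classifier fitted on the training folds only
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        pred = (np.linalg.norm(X[f] - c1, axis=1)
                < np.linalg.norm(X[f] - c0, axis=1)).astype(int)
        accs.append((pred == y[f]).mean())
    return accs

accs = kfold_accuracy(X, y)   # ten held-out accuracy estimates
```

Sensitivity and specificity as reported in the study would be computed per fold from the positive and negative classes separately, then averaged over the ten folds.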
Abstract:
Uncontrollable intracranial pressure elevation in hyperacute liver failure often proves fatal if no suitable liver for transplantation is found in due time. Both ABO-incompatible and auxiliary partial orthotopic liver transplantation have been described to control such a scenario. However, each method is associated with downsides in terms of immunobiology, organ availability and effects on the overall waiting list.
Abstract:
In 1969, Lovász asked whether every connected, vertex-transitive graph has a Hamilton path. This question has generated a considerable amount of interest, yet remains wide open. To date, no connected, vertex-transitive graph is known that does not possess a Hamilton path. For Cayley graphs, a subclass of vertex-transitive graphs, the following conjecture was made: Weak Lovász Conjecture: Every nontrivial, finite, connected Cayley graph is hamiltonian. The Chen-Quimpo Theorem proves that Cayley graphs on abelian groups abound with Hamilton cycles, which prompted Alspach to make the following conjecture: Alspach Conjecture: Every 2k-regular, connected Cayley graph on a finite abelian group has a Hamilton decomposition. Alspach's conjecture is true for k = 1 and 2, but even the case k = 3 is still open. It is this case that this thesis addresses. Chapters 1–3 give introductory material and past work on the conjecture. Chapter 3 also investigates the relationship between 6-regular Cayley graphs and associated quotient graphs, and a proof of Alspach's conjecture is given for the odd order case when k = 3. Chapter 4 provides a proof of the conjecture for even order graphs with 3-element connection sets that have an element generating a subgroup of index 2 and a linear dependency among the other generators. Chapter 5 shows that if Γ = Cay(A, {s1, s2, s3}) is a connected, 6-regular, abelian Cayley graph of even order, and for some 1 ≤ i ≤ 3, Δi = Cay(A/(si), {sj1, sj2}) is 4-regular, and Δi ≄ Cay(ℤ3, {1, 1}), then Γ has a Hamilton decomposition. Alternatively stated, if Γ = Cay(A, S) is a connected, 6-regular, abelian Cayley graph of even order, then Γ has a Hamilton decomposition if S has no involutions, and for some s ∈ S, Cay(A/(s), S) is 4-regular and of order at least 4.
Finally, the Appendices give computational data resulting from C and MAGMA programs used to generate Hamilton decompositions of certain non-isomorphic Cayley graphs on low order abelian groups.
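The objects involved can be made concrete with a toy computation: build the Cayley graph Cay(Zn, S) for a cyclic group and brute-force a Hamilton cycle. The group Z8 and connection set {±1, ±2, ±3} below are small examples chosen for illustration, not cases treated in the thesis (which decomposes such graphs into three edge-disjoint Hamilton cycles, a stronger task than the existence check here).

```python
from itertools import permutations

def cayley_edges(n, gens):
    """Edge set of Cay(Z_n, S), with S = gens closed under inverses."""
    s = set(gens) | {(-g) % n for g in gens}
    return {frozenset((v, (v + g) % n)) for v in range(n) for g in s}

def has_hamilton_cycle(n, edges):
    # Fix vertex 0 and try all orderings of the rest (fine only for tiny n).
    for perm in permutations(range(1, n)):
        cycle = (0,) + perm
        if all(frozenset((cycle[i], cycle[(i + 1) % n])) in edges
               for i in range(n)):
            return True
    return False

edges = cayley_edges(8, [1, 2, 3])   # 6-regular Cayley graph on Z_8
ok = has_hamilton_cycle(8, edges)    # generator 1 alone already gives a cycle
```

The generator 1 traverses 0, 1, ..., 7 and back, so the check succeeds immediately; the hard part, which the thesis addresses, is partitioning all 24 edges into three such cycles.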
Abstract:
Four papers, written in collaboration with the author’s graduate school advisor, are presented. In the first paper, uniform and non-uniform Berry-Esseen (BE) bounds on the convergence to normality of a general class of nonlinear statistics are provided; novel applications to specific statistics, including the non-central Student’s, Pearson’s, and the non-central Hotelling’s, are also stated. In the second paper, a BE bound on the rate of convergence of the F-statistic used in testing hypotheses from a general linear model is given. The third paper considers the asymptotic relative efficiency (ARE) between the Pearson, Spearman, and Kendall correlation statistics; conditions sufficient to ensure that the Spearman and Kendall statistics are equally (asymptotically) efficient are provided, and several models are considered which illustrate the use of such conditions. Lastly, the fourth paper proves that, in the bivariate normal model, the ARE between any of these correlation statistics possesses certain monotonicity properties; quadratic lower and upper bounds on the ARE are stated as direct applications of such monotonicity patterns.
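For orientation, the classical uniform Berry-Esseen bound for i.i.d. sums has the form below; the first two papers establish analogues of such bounds for general nonlinear statistics, and the constant C here is generic, not one derived in those papers.

```latex
\sup_{x\in\mathbb{R}}
\left|\,\mathbb{P}\!\left(\frac{S_n - n\mu}{\sigma\sqrt{n}} \le x\right) - \Phi(x)\right|
\;\le\; \frac{C\,\mathbb{E}\,|X_1-\mu|^{3}}{\sigma^{3}\sqrt{n}},
\qquad S_n = \sum_{i=1}^{n} X_i,
```

where the Xi are i.i.d. with mean μ, variance σ², finite third absolute moment, and Φ is the standard normal distribution function. Non-uniform versions weight the left-hand side by a factor growing in |x|.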
Abstract:
Nanoparticles are fascinating because their physical and optical properties depend on their size. Highly controllable synthesis methods and nanoparticle assembly are essential [6] for highly innovative technological applications. Among nanoparticles, nonhomogeneous core-shell nanoparticles (CSnp) exhibit new properties that arise when the relative dimensions of the core and the shell are varied. The CSnp structure enables various optical resonances and engineered energy barriers, in addition to a high charge-to-surface ratio. Assembly of homogeneous nanoparticles into functional structures has become ubiquitous in biosensors (e.g. optical labeling) [7, 8], nanocoatings [9-13], and electrical circuits [14, 15]. Assembly of nonhomogeneous nanoparticles, by contrast, has been explored only to a limited extent. Many conventional nanoparticle assembly methods exist, but this work explores dielectrophoresis (DEP) as a new method. DEP is the motion of particles, polarized by non-uniform electric fields, while suspended in conductive fluids. Most prior DEP efforts involve microscale particles. Prior work on core-shell nanoparticle assemblies and, separately, on nanoparticle characterization with dielectrophoresis and electrorotation [2-5] did not systematically explore particle size, dielectric properties (permittivity and electrical conductivity), shell thickness, particle concentration, medium conductivity, and frequency. This work is the first, to the best of our knowledge, to systematically examine these dielectrophoretic properties for core-shell nanoparticles. Further, we conduct a parametric fitting to traditional core-shell models. These biocompatible core-shell nanoparticles were studied to fill a knowledge gap in the DEP field. Experimental results (chapter 5) first examine the medium conductivity, size, and shell material dependencies of the dielectrophoretic behavior of spherical CSnp assembled into 2D and 3D particle assemblies.
Chitosan (an amino sugar) and poly-L-lysine (PLL, an amino acid polymer) CSnp shell materials were custom synthesized around a hollow (gas) core by utilizing a phospholipid micelle around a volatile fluid as a template for the shell material; this approach proves to be novel and distinct from conventional core-shell models, wherein a conductive core is coated with an insulative shell. Experiments were conducted within a 100 nl chamber housing 100 µm wide Ti/Au quadrupole electrodes spaced 25 µm apart. Frequencies from 100 kHz to 80 MHz at a fixed local field of 5 Vpp were tested with 10⁻⁵ and 10⁻³ S/m medium conductivities for 25 seconds. Dielectrophoretic responses of ~220 and ~340 (or ~400) nm chitosan or PLL CSnp were compiled as a function of medium conductivity, size, and shell material.
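The "traditional core-shell model" used for the parametric fitting is the classical single-shell sphere: the core and shell are collapsed into one effective complex permittivity, from which the Clausius-Mossotti factor f_CM follows; the sign of Re[f_CM] sets whether DEP is positive or negative at a given frequency. A sketch of that standard calculation is below; the material parameters are illustrative stand-ins (gas core, polymer-like shell, low-conductivity aqueous medium), not the fitted values from this work.

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def complex_perm(eps_r, sigma, omega):
    """Complex permittivity eps* = eps - j*sigma/omega."""
    return eps_r * EPS0 - 1j * sigma / omega

def single_shell_re_fcm(freq, core, shell, medium, r_out, r_in):
    """Re[f_CM] of a shelled sphere; core/shell/medium are
    (relative permittivity, conductivity in S/m) pairs."""
    w = 2 * np.pi * freq
    ec, es, em = (complex_perm(*m, w) for m in (core, shell, medium))
    g = (r_out / r_in) ** 3
    k = (ec - es) / (ec + 2 * es)
    e_eff = es * (g + 2 * k) / (g - k)      # effective particle permittivity
    return ((e_eff - em) / (e_eff + 2 * em)).real

# Sweep the experimental frequency window, 100 kHz to ~80 MHz:
f = np.logspace(5, 7.9, 50)
re_fcm = [single_shell_re_fcm(fi, (1.0, 1e-9), (60.0, 1e-4), (78.0, 1e-5),
                              170e-9, 150e-9) for fi in f]
```

Fitting then amounts to adjusting the shell permittivity, conductivity, and thickness until the predicted sign changes of Re[f_CM] match the observed assembly behavior across medium conductivities.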
Abstract:
PURPOSE: The aim of this study was to analyze prosthetic maintenance in partially edentulous patients with removable prostheses supported by teeth and strategic implants. MATERIALS AND METHODS: Sixty patients with removable partial prostheses and combined tooth-implant support were identified within the time period from 1998 to 2006. One group consisted of 42 patients (planned group) with a reduced residual dentition and in need of removable partial dentures (RPDs) or overdentures in the maxilla and/or mandible. They were admitted consecutively for treatment. Due to missing teeth in strategically important positions, one or two implants were placed to improve symmetrical denture support and retention. The majority of residual teeth exhibited impaired structural integrity and were therefore provided with root copings for denture retention. A few vital teeth were used for telescopic crowns. The anchorage system for the strategic implants was selected accordingly. A second group of 18 patients (repair group) wearing RPDs who had lost one abutment tooth due to biologic or mechanical failure was identified. These abutment teeth were replaced by 21 implants, and the patients continued to wear their original prostheses. The observation time for the planned and repair groups was 12 months to 8 years. All patients followed a regular maintenance schedule. Technical or biologic complications with supporting teeth or implants and prosthetic service were registered regularly. RESULTS: Three maxillary implants were lost after loading and three roots with copings had to be removed. Biologic problems included caries and periodontal/peri-implant infection, with a significantly higher incidence in the repair group (P < .05). Technical complications with the dentures were rather frequent in both groups, mostly related to the anchorage system (matrices) of root copings and implants.
Maintenance and complications were observed more frequently in the first year after delivery of the denture than in the following 3 years (P < .05). No denture had to be remade. CONCLUSIONS: The placement of a few implants allows for maintaining a compromised residual dentition for support of RPDs. The combination of root and implant support facilitates treatment planning and enhances designing the removable denture. It also proves to be a practical rescue method. Technical problems with the anchorage system were frequent, particularly in the first year after delivery of the dentures.
Abstract:
PURPOSE: Resonance frequency analysis (RFA) offers the opportunity to monitor the osseointegration of an implant in a simple, noninvasive way. A better comprehension of the relationship between RFA and parameters related to bone quality would therefore help clinicians improve diagnoses. In this study, a bone analog made from polyurethane foam was used to isolate the influences of bone density and cortical thickness in RFA. MATERIALS AND METHODS: Straumann standard implants were inserted in polyurethane foam blocks, and primary implant stability was measured with RFA. The blocks were composed of two superimposed layers with different densities. The top layer was dense to mimic cortical bone, whereas the bottom layer had a lower density to represent trabecular bone. Different densities for both layers and different thicknesses for the simulated cortical layer were tested, resulting in eight different block combinations. RFA was compared with two other mechanical evaluations of primary stability: removal torque and axial loading response. RESULTS: The primary stability measured with RFA did not correlate with the two other methods, but there was a significant correlation between removal torque and the axial loading response (P < .005). Statistical analysis revealed that each method was sensitive to different aspects of bone quality. RFA was the only method able to detect changes in both bone density and cortical thickness. However, changes in trabecular bone density were easier to distinguish with removal torque and axial loading than with RFA. CONCLUSIONS: This study shows that RFA, removal torque, and axial loading are sensitive to different aspects of the bone-implant interface. This explains the absence of correlation among the methods and proves that no standard procedure exists for the evaluation of primary stability.
Abstract:
"Geiz ist geil!" ("Stinginess is cool!") is the motto of a retail chain. But this philosophy is proving, ever more frequently, unsuitable for securing long-term success. For precisely this reason, attention is often turning back to quality: the quality of products, of manufacturing processes, of logistics processes, and so on. This change of mindset also affects all packaging processes, since these are inseparably linked to safeguarding product quality and to the reliable handling of all logistics processes. Besides the demand for economical product protection as the core task of packaging, mandatory requirements must also be observed, for example those imposed by legislators (e.g. in the food sector, in dangerous-goods logistics, and in road traffic law). Among other things, this means that all packaged goods should be protected well enough to withstand the stresses of the transport process as well as those arising from cargo-securing and load-unit-securing measures. Since not all cargo items or packages are equally suited to form-fit or force-fit securing measures, the design of packaging measures should pay particular attention to the helpful effect of friction forces in reducing the need for additional securing measures.
The Impact of Western Social Workers in Romania - a Fine Line between Empowerment and Disempowerment
Abstract:
Ideally, the social work profession promotes social change, problem solving in human relationships, and the empowerment and liberation of people to enhance their well-being (IFSW 2004). Social work practice, however, often proves to be different. Social workers are always in danger of making decisions for their clients or defining problems according to their own interpretation and world view. In quite a number of cases, the consequence of such a practice is that the clients feel disempowered rather than empowered. This dilemma multiplies when western social workers get involved in developing countries. The potential that an intervention intended to empower and liberate people turns into disempowerment is tremendously higher because of the differences in tradition, culture and society on the one side, and the power imbalance between the 'West' and the 'Rest' on the other. Especially in developing countries, where the vast majority of people live in poverty, many Western social workers arrive with a lot of sympathy and the idea of helping the poor and changing the world. An example is Romania. After the collapse of communism in 1989, Romania was an economically, politically and socially devastated country. The pictures of its orphanages shocked the western world. As a result, many Non-Governmental Organisations (NGOs), churches and individuals brought humanitarian goods to Romania in order to alleviate the misery of the Romanian people, and especially of the children. Since then, important changes in all areas of life have occurred, mostly with foreign financial aid and support. At the political level, democratic institutions were established, a liberal market economy was launched, and laws were adapted to western standards in view of accession to the European Union and NATO.
The western world has also left its marks at the grassroots level in the form of NGOs and social service agencies established through western grants and individuals. Beyond that, western goods and investment are omnipresent in Romania. This reflects a newly-gained freedom and prosperity, and Romania certainly profits from these changes. But this is only one side of the coin, as the effect of westernisation clashes with the Romanian reality and overruns many deep-rooted traditions, and with them the majority of people. Moreover, only a small percentage of the population has access to this western world. Western concepts, procedures and interpretations often differ greatly from Romanian tradition, history and culture. Nevertheless, western ideas seem to dominate the transition in many areas of daily life in Romania. A closer look reveals that many changes take place due to pressure from western governments and are conditioned on financial support. The dialectic relationship between the need for foreign aid and its implementation becomes very obvious in Romania and often leads, despite the substantial benefits, to unpredictable and rather negative side-effects at a political, social, cultural, ecological and/or economic level. This reality is a huge dilemma for all those involved, as there is a fine line between empowering and disempowering action. It is beyond the scope of this journal to discuss the dilemma posed by Western involvement at all levels; this article therefore focuses on the impact of Western social workers in Romania. The first part consists of a short introduction to social work in Romania, followed by a discussion of the dilemma posed by the structure of international social work projects and the organisation of private social service agencies.
Thirdly, the experiences of Romanian staff with Western social workers are presented and then discussed with regard to turning the disempowering tendencies of Western social workers into empowerment.
Abstract:
In February 1962, Hamburg experienced its most catastrophic storm surge event of the 20th century. This paper analyses the event using the Twentieth Century Reanalysis (20CR) dataset. Responsible for the major flood was a strong low-pressure system centred over Scandinavia, associated with strong north-westerly winds towards the German North Sea coast, the ideal storm surge situation for the Elbe estuary. A comparison of the 20CR dataset with observational data proves the applicability of the reanalysis data for this extreme event.
Abstract:
In recent years, the importance of a long final fixation before movement onset, the so-called "quiet eye", for motor performance in sport has been demonstrated many times. Although this phenomenon has been studied extensively, a satisfactory explanation is still lacking. This article therefore discusses current attempted explanations. It turns out that the existing accounts from cognitive and ecological psychology exhibit conceptual or methodological shortcomings. For these reasons, an inhibition mechanism is proposed to explain the quiet-eye phenomenon, initially for tasks with high precision demands, with the central claim that a "quiet eye" shields the processing of performance-relevant cues from interference. Finally, it is shown that the proposed mechanism proves compatible with the existing body of findings and allows further predictions to be derived.
Abstract:
Only a few sites in the Alps have produced archaeological finds from melting ice. To date, prehistoric finds from four sites dating from the Neolithic period, the Bronze Age, and the Iron Age have been recovered from small ice patches (Schnidejoch, Lötschenpass, Tisenjoch, and Gemsbichl/Rieserferner). Glaciers, on the other hand, have yielded historic finds and frozen human remains that are not more than a few hundred years old (three glacier mummies from the 16th to the 19th century and military finds from World Wars I and II). Between 2003 and 2010, numerous archaeological finds were recovered from a melting ice patch on the Schnidejoch in the Bernese Alps (Cantons of Berne and Valais, Switzerland). These finds date from the Neolithic period, the Early Bronze Age, the Iron Age, Roman times, and the Middle Ages, spanning a period of 6000 years. The Schnidejoch, at an altitude of 2756 m asl, is a pass in the Wildhorn region of the western Bernese Alps. It has yielded some of the earliest evidence of Neolithic human activity at high altitude in the Alps. The abundant assemblage of finds contains a number of unique artifacts, mainly from organic materials like leather, wood, bark, and fibers. The site clearly proves access to high-mountain areas as early as the 5th millennium BC, and the chronological distribution of the finds indicates that the Schnidejoch pass was used mainly during periods when glaciers were retreating.
Abstract:
The first section of this chapter starts with the Buffon problem, which is one of the oldest in stochastic geometry, and then continues with the definition of measures on the space of lines. The second section defines random closed sets and related measurability issues, explains how to characterize distributions of random closed sets by means of capacity functionals, and introduces the concept of a selection. Based on this concept, the third section starts with the definition of the expectation and proves its convexifying effect, which is related to the Lyapunov theorem for ranges of vector-valued measures. Finally, the strong law of large numbers for Minkowski sums of random sets is proved and the corresponding limit theorem is formulated. The chapter is concluded by a discussion of the union scheme for random closed sets and a characterization of the corresponding stable laws.
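The Buffon problem that opens the chapter has a closed-form answer: a needle of length L ≤ d dropped on a floor ruled with parallel lines a distance d apart crosses a line with probability 2L/(πd). A quick Monte Carlo check of that formula (not taken from the chapter, which treats the problem via measures on the space of lines):

```python
import math
import random

def buffon_crossing_rate(n, needle=1.0, spacing=1.0, seed=42):
    """Estimate the crossing probability by dropping n needles."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.uniform(0, spacing / 2)       # centre's distance to nearest line
        theta = rng.uniform(0, math.pi / 2)   # needle's angle to the lines
        if x <= (needle / 2) * math.sin(theta):
            hits += 1
    return hits / n

p_hat = buffon_crossing_rate(200_000)
p_exact = 2 * 1.0 / (math.pi * 1.0)           # 2L/(pi*d) = 2/pi for L = d = 1
```

By symmetry it suffices to sample the centre's offset on [0, d/2] and the angle on [0, π/2]; integrating the indicator over that rectangle recovers 2L/(πd).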