966 results for quantifying heteroskedasticity


Relevance: 10.00%

Abstract:

Functional linkage between reef habitat quality and fish growth and production has remained elusive. Most current research is focused on correlative relationships between a general habitat type and the presence/absence of a species, an index of species abundance, or species diversity. Such descriptive information largely ignores how reef attributes regulate reef fish abundance (density-dependent habitat selection), trophic interactions, and physiological performance (growth and condition). To determine the functional relationship between habitat quality, fish abundance, trophic interactions, and physiological performance, we are using an experimental reef system in the northeastern Gulf of Mexico where we apply advanced sensor and biochemical technologies. Our study site controls for reef attributes (size, cavity space, and reef mosaics) and focuses on the processes that regulate gag grouper (Mycteroperca microlepis) abundance, behavior, and performance (growth and condition), and the availability of their pelagic prey. We combine mobile and fixed-active (fisheries) acoustics, passive acoustics, video cameras, and advanced biochemical techniques. Fisheries acoustics quantifies the abundance and behavior of the pelagic prey fishes associated with the reefs. Passive acoustics and video allow direct observation of gag and prey fish behavior and the acoustic environment, and provide a direct visual reference for the interpretation of fixed fisheries acoustics measurements. New applications of biochemical techniques, such as the Electron Transport System (ETS) assay, allow in situ measurement of the metabolic expenditure of gag and relate it back to reef attributes, gag behavior, and prey fish availability. Here, we provide an overview of our integrated technological approach for understanding and quantifying the functional relationship between reef habitat quality and one element of production – gag grouper growth on shallow coastal reefs.

Relevance: 10.00%

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop on Evaluating Approaches and Technologies for Monitoring Organic Contaminants in the Aquatic Environment in Ann Arbor, MI on July 21-23, 2006. The primary objectives of this workshop were to: 1) identify the priority management information needs relative to organic contaminant loading; 2) explore the most appropriate approaches to estimating mass loading; and 3) evaluate the current status of the sensor technology. To meet these objectives, a mixture of leading research scientists, resource managers, and industry representatives was brought together for a focused two-day workshop. The workshop featured four plenary talks followed by breakout sessions in which arranged groups of participants were charged to respond to a series of focused discussion questions. At present, there are major concerns about the inadequacies of approaches and technologies for quantifying mass emissions and detecting organic contaminants to protect municipal water supplies and receiving waters. Managers use estimates of land-based contaminant loadings to rivers, lakes, and oceans to assess relative risk among various contaminant sources, determine compliance with regulatory standards, and define progress in source reduction. However, accurately quantifying contaminant loading remains a major challenge. Loading occurs over a range of hydrologic conditions, requiring measurement technologies that can accommodate a broad range of ambient conditions. In addition, in situ chemical sensors that provide a means for acquiring continuous concentration measurements are still under development, particularly for organic contaminants that typically occur at low concentrations. Better approaches and strategies for estimating contaminant loading, including evaluations of both sampling design and sensor technologies, need to be identified. The following general recommendations were made in an effort to advance future organic contaminant monitoring:
1. Improve the understanding of material balance in aquatic systems and the relationship between potential surrogate measures (e.g., DOC, chlorophyll, particle size distribution) and target constituents.
2. Develop continuous real-time sensors to be used by managers as screening measures and triggers for more intensive monitoring.
3. Pursue surrogate measures and indicators of organic pollutant contamination, such as CDOM, turbidity, or non-equilibrium partitioning.
4. Develop continuous field-deployable sensors for PCBs, PAHs, pyrethroids, and emerging contaminants of concern, and develop strategies that couple sampling approaches with tools that incorporate sensor synergy (i.e., measure appropriate surrogates along with the dissolved organics to allow full mass emission estimation).
[PDF contains 20 pages]

Relevance: 10.00%

Abstract:

The Alliance for Coastal Technologies (ACT) held a Workshop on Sensor Technology for Assessing Groundwater-Surface Water Interactions in the Coastal Zone on March 7 to 9, 2005 in Savannah, GA. The main goal of the workshop was to summarize the general parameters that have been found to be useful in assessing groundwater-surface water (GW-SW) interactions in the coastal zone. The workshop participants (Appendix I) were specifically charged with identifying the types of sensor systems, if any, that have been used to obtain time-series data and to make known which parameters may be the most amenable to the development/application of sensor technology. The group consisted of researchers, industry representatives, and environmental managers. Four general recommendations were made:
1. Educate coastal managers and agencies on the importance of GW-SW interactions, keeping in mind that regulatory agencies are driven by a different set of rules than researchers: the focus is on understanding the significance of the problem and providing solutions. ACT could facilitate this process in two ways. First, given that the research literature on this subject is fairly diffuse, ACT could provide links from its web site to fact sheets or other literature. Second, ACT could organize a focused meeting for managers and/or agency groups.
2. Encourage development of primary tools for quantifying flow. The most promising technology in this respect is flow meters designed for flux chambers, mainly because they should be simple to use and can be made relatively inexpensively. However, it should be kept in mind that they provide only point measurements, and several would need to be deployed as a network in order to obtain reliable flow estimates. For evaluating system-wide GW-SW interactions, tools that integrate the signal over large areas would be required. Suggestions include user-friendly hydrogeologic models, keeping in mind that freshwater flow is not the entire story, or continuous radon monitors. Though the latter would be slightly more difficult to use in terms of background knowledge, such an instrument would be low power and easy to operate and maintain. ACT could facilitate this recommendation by identifying funding opportunities on its web site and/or performing evaluations of existing technologies that could be summarized on the web site.
(pdf contains 18 pages)

Relevance: 10.00%

Abstract:

In order to restore the balance between available fish resources and catch capacities in the marine waters of the EU, the European Commission has introduced so-called Multiannual Guidance Programmes (MAGPs) within the framework of the Common Fisheries Policy (CFP). However, the unquantified relation between the fishing effort and the fishing power of a vessel has proved to be one of the most difficult problems. The present contribution suggests replacing traditional, non-quantitative methods by including the real catch results in the models.
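As an illustration of folding real catch results into such a model, the sketch below fits a log-linear catch-per-unit-effort (CPUE) model with vessel effects, so that each vessel's fitted coefficient serves as a relative fishing-power estimate. The vessel names, effort values, and catches are hypothetical, and this generic standardization stands in for whatever model the paper actually proposes.

```python
import numpy as np

# Generic log-linear CPUE standardization: each vessel's fitted coefficient
# acts as a relative fishing-power estimate derived from real catch results.
# Vessel names, efforts, and catches below are hypothetical illustrations.
rng = np.random.default_rng(0)
vessel_names = ("vesselA", "vesselB", "vesselC")
vessels = np.repeat(vessel_names, 40)                      # 40 trips per vessel
true_power = {"vesselA": 1.0, "vesselB": 1.6, "vesselC": 0.7}
effort = rng.uniform(5.0, 20.0, vessels.size)              # hours fished per trip
catch = (np.array([true_power[v] for v in vessels]) * effort
         * rng.lognormal(0.0, 0.3, vessels.size))          # noisy catch in tonnes

# Ordinary least squares on log(CPUE) with one dummy variable per vessel.
log_cpue = np.log(catch / effort)
X = np.column_stack([(vessels == v).astype(float) for v in vessel_names])
coef, *_ = np.linalg.lstsq(X, log_cpue, rcond=None)

for name, power in zip(vessel_names, np.exp(coef)):
    print(f"{name}: estimated relative fishing power ~ {power:.2f}")
```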

Relevance: 10.00%

Abstract:

The thermal fluctuation approach is widely used to monitor the association kinetics of surface-bound receptor-ligand interactions. Various protocols, such as sliding standard deviation (SD) analysis (SSA) and Page's test analysis (PTA), have been used to estimate two-dimensional (2D) kinetic rates from the time course of displacement of a molecular carrier. In the current work, we compared the estimates from both SSA and modified PTA using measured data from an optical trap assay and simulated data from a random number generator. Our results indicated that both SSA and PTA are reliable in estimating 2D kinetic rates. Parametric analysis also demonstrated that such estimates are sensitive to parameters such as sampling rate, sliding window size, and threshold. These results further our understanding of how to quantify the biophysics of receptor-ligand interactions.
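The sketch below illustrates the sliding SD idea on a synthetic displacement trace: thermal fluctuations of the carrier shrink while a bond holds it, so windows whose SD falls below a threshold are scored as bound. The window size, threshold, and noise levels are illustrative assumptions; in practice the distribution of bound and unbound lifetimes extracted this way feeds the 2D kinetic-rate estimates.

```python
import numpy as np

# Sliding standard deviation analysis (SSA) on a synthetic carrier-displacement
# trace. Window size, threshold, and noise amplitudes are assumed values.

def sliding_sd(displacement, window=50):
    """Standard deviation of the trace inside a centred sliding window."""
    half = window // 2
    sd = np.full(displacement.size, np.nan)
    for i in range(half, displacement.size - half):
        sd[i] = displacement[i - half:i + half].std()
    return sd

rng = np.random.default_rng(0)
trace = np.concatenate([rng.normal(0.0, 5.0, 2000),   # unbound: large thermal fluctuation
                        rng.normal(0.0, 1.0, 1000),   # bound: suppressed fluctuation
                        rng.normal(0.0, 5.0, 2000)])  # unbound again

sd = sliding_sd(trace, window=50)
valid = ~np.isnan(sd)
bound = sd[valid] < 2.5                               # threshold between the two SD levels
print("fraction of samples scored as bound:", bound.mean())
# The durations of the bound and unbound intervals recovered this way are the
# raw material for estimating the 2D on- and off-rates.
```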

Relevance: 10.00%

Abstract:

The forces cells apply to their surroundings control biological processes such as growth, adhesion, development, and migration. In the past 20 years, a number of experimental techniques have been developed to measure such cell tractions. These approaches have primarily measured the tractions applied by cells to synthetic two-dimensional substrates, which do not mimic in vivo conditions for most cell types. Many cell types live in a fibrous three-dimensional (3D) matrix environment. While studying cell behavior in such 3D matrices will provide valuable insights for the mechanobiology and tissue engineering communities, no experimental approaches have yet measured cell tractions in a fibrous 3D matrix.

This thesis describes the development and application of an experimental technique for quantifying cellular forces in a natural 3D matrix. Cells and their surrounding matrix are imaged in three dimensions with high-speed confocal microscopy. The cell-induced matrix displacements are computed from the 3D image volumes using digital volume correlation. The strain tensor in the 3D matrix is computed by differentiating the displacements, and the stress tensor is computed by applying a constitutive law. Finally, tractions applied by the cells to the matrix are computed directly from the stress tensor.
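A minimal sketch of that displacement-to-traction pipeline is shown below, assuming small strains, a regular sampling grid, and a linear isotropic constitutive law as a stand-in for the fibrous-matrix model used in the thesis; the modulus, Poisson ratio, and toy displacement field are illustrative.

```python
import numpy as np

# Sketch of the displacement -> strain -> stress -> traction pipeline.
# Assumptions: small strains, a regular grid, linear isotropic elasticity.

def strain_tensor(u, spacing):
    """Small-strain tensor eps_ij = 0.5*(du_i/dx_j + du_j/dx_i); u has shape (3, nx, ny, nz)."""
    grad = np.array([np.gradient(u[i], *spacing) for i in range(3)])  # grad[i][j] = du_i/dx_j
    return 0.5 * (grad + grad.transpose(1, 0, 2, 3, 4))

def stress_tensor(eps, E=1.0e3, nu=0.45):
    """Linear isotropic Hooke's law (placeholder for the fibrous constitutive model)."""
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))
    mu = E / (2 * (1 + nu))
    trace = np.trace(eps, axis1=0, axis2=1)
    identity = np.eye(3)[:, :, None, None, None]
    return lam * trace * identity + 2 * mu * eps

def traction(sigma, normal):
    """Cauchy traction t_i = sigma_ij n_j at every grid point."""
    return np.einsum('ij...,j->i...', sigma, normal)

# Usage on a toy displacement field: a small uniaxial stretch along x.
nx = ny = nz = 16
x = np.linspace(0.0, 1.0, nx)
u = np.zeros((3, nx, ny, nz))
u[0] = 0.01 * x[:, None, None]                         # u_x grows linearly with x
eps = strain_tensor(u, spacing=(x[1] - x[0],) * 3)
sig = stress_tensor(eps)
t = traction(sig, normal=np.array([1.0, 0.0, 0.0]))
print("mean normal traction on an x-facing surface:", t[0].mean())
```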

The 3D traction measurement approach is used to investigate how cells mechanically interact with the matrix in biologically relevant processes such as division and invasion. During division, a single mother cell undergoes a drastic morphological change to split into two daughter cells. In a 3D matrix, dividing cells apply tensile force to the matrix through thin, persistent extensions that in turn direct the orientation and location of the daughter cells. Cell invasion into a 3D matrix is the first step required for cell migration in three dimensions. During invasion, cells initially apply minimal tractions to the matrix as they extend thin protrusions into the matrix fiber network. The invading cells anchor themselves to the matrix using these protrusions, and subsequently pull on the matrix to propel themselves forward.

Lastly, this thesis describes a constitutive model for the 3D fibrous matrix that uses a finite element (FE) approach. The FE model simulates the fibrous microstructure of the matrix and matches the cell-induced matrix displacements observed experimentally using digital volume correlation. The model is applied to predict how cells mechanically sense one another in a 3D matrix. It is found that cell-induced matrix displacements localize along linear paths. These linear paths propagate over a long range through the fibrous matrix, and provide a mechanism for cell-cell signaling and mechanosensing. The FE model developed here has the potential to reveal the effects of matrix density, inhomogeneity, and anisotropy in signaling cell behavior through mechanotransduction.

Relevance: 10.00%

Abstract:

The two most important digital-system design goals today are to reduce power consumption and to increase reliability. Reductions in power consumption improve battery life in the mobile space and reductions in energy lower operating costs in the datacenter. Increased robustness and reliability shorten down time, improve yield, and are invaluable in the context of safety-critical systems. While optimizing towards these two goals is important at all design levels, optimizations at the circuit level have the furthest-reaching effects; they apply to all digital systems. This dissertation presents a study of robust minimum-energy digital circuit design and analysis. It introduces new device models, metrics, and methods of calculation—all necessary first steps towards building better systems—and demonstrates how to apply these techniques. It analyzes a fabricated chip (a full-custom QDI microcontroller designed at Caltech and taped out in 40-nm silicon) by calculating the minimum energy operating point and quantifying the chip’s robustness in the face of both timing and functional failures.
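For illustration, the sketch below locates a minimum-energy operating point by sweeping supply voltage under a generic alpha-power delay model and a fixed chip-level leakage current; every constant is an assumed, illustrative value rather than one of the dissertation's calibrated 40-nm device models.

```python
import numpy as np

# Minimum-energy operating point from a supply-voltage sweep.
# Device model and all constants below are illustrative assumptions.
C_sw    = 1e-12     # switched capacitance per operation [F]        (assumed)
I_leak0 = 1e-4      # chip-level leakage current [A]                (assumed, V-independent)
V_t     = 0.35      # threshold voltage [V]                         (assumed)
alpha   = 1.5       # velocity-saturation exponent                  (assumed)
k_d     = 2e-9      # delay fitting constant                        (assumed)

V = np.linspace(0.45, 1.1, 200)                  # supply-voltage sweep
delay   = k_d * V / (V - V_t) ** alpha           # alpha-power-law cycle delay
E_dyn   = C_sw * V ** 2                          # dynamic (switching) energy per op
E_leak  = I_leak0 * V * delay                    # leakage energy integrated over the cycle
E_total = E_dyn + E_leak

i_min = np.argmin(E_total)
print(f"minimum-energy point near V = {V[i_min]:.2f} V, "
      f"E = {E_total[i_min] * 1e15:.1f} fJ per operation")
```

Lowering the supply shrinks the switching energy but stretches the cycle time, so leakage energy grows; the minimum-energy point sits where the two trends balance.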

Relevance: 10.00%

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, growing from patient zero to an estimated 650,000 to 900,000 infected Americans today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database was collected by me from an outpatient HIV clinic and includes data from 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, which covers the period between 1984 and October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
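As a simple illustration of checking whether a CD4 marker is plausibly Gaussian, the sketch below applies a D'Agostino-Pearson normality test to simulated, right-skewed counts; both the data and the particular test are illustrative and are not the distributional analysis used in the thesis.

```python
import numpy as np
from scipy import stats

# Simulated, right-skewed CD4 counts (cells/uL); hypothetical, not cohort data.
rng = np.random.default_rng(0)
cd4_counts = rng.lognormal(mean=6.0, sigma=0.5, size=500)

stat, p = stats.normaltest(cd4_counts)              # D'Agostino-Pearson normality test
print(f"normality test p-value: {p:.2e}")           # tiny p-value => reject Gaussianity
print(f"sample skewness: {stats.skew(cd4_counts):.2f}")
```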

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono- or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was the presence of an AIDS-defining illness. A high level of clinical failure, or progression to an endpoint, was found.
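The sketch below shows the kind of Kaplan-Meier estimate such a survival analysis rests on, hand-rolled for clarity; the follow-up times and endpoint indicators are simulated stand-ins, not the 213-patient clinical data.

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier survival curve. time: follow-up (months); event: 1 if the
    endpoint (an AIDS-defining illness) occurred, 0 if censored."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    surv, curve = 1.0, []
    n_at_risk = time.size
    for t in np.unique(time):                       # ascending distinct times
        at_t = time == t
        d = event[at_t].sum()                       # endpoints observed at time t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
        curve.append((t, surv))
        n_at_risk -= at_t.sum()                     # events and censorings leave the risk set
    return curve

# Usage with hypothetical follow-up data (months on HAART, endpoint indicator).
rng = np.random.default_rng(1)
months = rng.exponential(36.0, size=213).round(1)
endpoint = rng.integers(0, 2, size=213)             # 1 = AIDS-defining illness observed
for t, s in kaplan_meier(months, endpoint)[::40]:
    print(f"t = {t:6.1f} months   S(t) = {s:.2f}")
```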

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, about where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, failed to control viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in the morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated. The direct lifetime cost of treating each HIV-infected patient with HAART is estimated to be between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable with the incremental cost per year of life saved from coronary artery bypass surgery.

Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance: 10.00%

Abstract:

The LIGO and Virgo gravitational-wave observatories are complex and extremely sensitive strain detectors that can be used to search for a wide variety of gravitational waves from astrophysical and cosmological sources. In this thesis, I motivate the search for the gravitational wave signals from coalescing black hole binary systems with total mass between 25 and 100 solar masses. The mechanisms for formation of such systems are not well-understood, and we do not have many observational constraints on the parameters that guide the formation scenarios. Detection of gravitational waves from such systems — or, in the absence of detection, the tightening of upper limits on the rate of such coalescences — will provide valuable information that can inform the astrophysics of the formation of these systems. I review the search for these systems and place upper limits on the rate of black hole binary coalescences with total mass between 25 and 100 solar masses. I then show how the sensitivity of this search can be improved by up to 40% by the application of the multivariate statistical classifier known as a random forest of bagged decision trees to more effectively discriminate between signal and non-Gaussian instrumental noise. I also discuss the use of this classifier in the search for the ringdown signal from the merger of two black holes with total mass between 50 and 450 solar masses and present upper limits. I also apply multivariate statistical classifiers to the problem of quantifying the non-Gaussianity of LIGO data. Despite these improvements, no gravitational-wave signals have been detected in LIGO data so far. However, the use of multivariate statistical classification can significantly improve the sensitivity of the Advanced LIGO detectors to such signals.
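For illustration, the sketch below trains a random forest of bagged decision trees with scikit-learn to separate simulated signal triggers from glitch-like noise triggers; the two features (a matched-filter SNR and a chi-squared consistency statistic) and all distributions are assumed stand-ins, not the search's actual feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Separate candidate "signal" triggers from non-Gaussian "glitch" triggers
# with a random forest. Features and numbers are illustrative assumptions.
rng = np.random.default_rng(0)
n = 5000
snr  = np.concatenate([rng.normal(9.0, 2.0, n),  rng.normal(12.0, 6.0, n)])   # signal vs glitch
chi2 = np.concatenate([rng.normal(1.0, 0.3, n), rng.normal(4.0, 2.0, n)])
X = np.column_stack([snr, chi2])
y = np.concatenate([np.ones(n), np.zeros(n)])                                  # 1 = simulated signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

# Rank candidates by the forest's signal probability instead of SNR alone.
scores = clf.predict_proba(X_te)[:, 1]
print("held-out accuracy:", clf.score(X_te, y_te))
```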

Relevance: 10.00%

Abstract:

This study aims to demonstrate that, in cases where someone interferes in another person's legal sphere and obtains economic benefits without causing damage to the right holder, or where damage is caused but the profit obtained by the wrongdoer exceeds the damage, the rules of civil liability, taken alone, are not sufficient under Brazilian law as an effective sanction for the violation of an interest deserving protection. This is because, since the main function of civil liability is to remedy the damage, in those cases, were it not for an alternative remedy, the intervener would keep the profit of the intervention: in full in the first case and, in the second, in the amount corresponding to the balance between the profit obtained and the compensation owed to the victim. The thesis seeks to show that the problem of profit from intervention should not be resolved through the rules of civil liability, and that proposed solutions in that field should therefore be rejected, such as the extensive interpretation of the sole paragraph of article 944 of the Civil Code, punitive damages, and the so-called third method of calculating compensation. As an alternative, the thesis proposes framing the profit from intervention doctrinally within unjust enrichment, granting the right holder a claim for restitution of the profit obtained by the wrongdoer through undue interference with the holder's assets or rights. It is argued that the transfer of the profit from intervention to the right holder is grounded in a balancing of the interests at stake in light of the Federal Constitution, with particular attention to the principle of solidarity, and in the theory of the legal allocation of assets. The thesis further seeks to show that Brazilian law does not require actual impoverishment of the right holder for unjust enrichment to arise, and that the subsidiarity rule does not prevent the joinder of claims: a civil liability claim to eliminate the damage (and limited to the damage), and an unjust enrichment claim to compel restitution of any positive balance remaining in the wrongdoer's estate after compensation has been paid. Finally, the thesis aims to open a discussion on how to quantify the object of restitution, proposing criteria to guide courts in applying the law.

Relevance: 10.00%

Abstract:

The high specific energy and power capacities of rechargeable lithium metal (Li0) batteries make them ideally suited to portable devices and valuable as storage units for intermittent renewable energy sources. Lithium, the lightest and most electropositive metal, would be the optimal anode material for rechargeable batteries if it were not for the fact that such devices fail unexpectedly by short-circuiting via the dendrites that grow across electrodes upon recharging. This phenomenon poses a major safety issue because it triggers a series of adverse events that start with overheating, potentially followed by thermal decomposition and ultimately the ignition of the organic solvents used in such devices.

In this thesis, we developed an experimental platform for monitoring and quantifying the dendrite populations grown in a Li battery prototype upon charging under various conditions. We explored the effects of pulse charging in the kHz range and of temperature on dendrite growth, and also on the capacity lost to detached “dead” lithium particles.

Simultaneously, we developed a computational framework for understanding the dynamics of dendrite propagation. The coarse-grained Monte Carlo model assisted us in the interpretation of the pulsing experiments, whereas MD calculations provided insights into the mechanism of dendrite thermal relaxation. We also developed a computational framework for measuring the dead lithium crystals in the experimental images.
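The sketch below is a toy, DLA-style lattice Monte Carlo model of uneven deposition, included only to show what a coarse-grained dendrite-growth simulation looks like; the lattice size, sticking rule, and walker statistics are arbitrary assumptions and are not the model developed in the thesis.

```python
import numpy as np

# Toy DLA-style lattice Monte Carlo: Li-ion "walkers" released near the
# separator random-walk until they touch the electrode (row 0) or an already
# deposited site, then stick. All parameters are illustrative.
rng = np.random.default_rng(0)
W, H = 60, 40
occupied = np.zeros((H, W), dtype=bool)
occupied[0, :] = True                                   # flat electrode surface
moves = ((1, 0), (-1, 0), (0, 1), (0, -1))              # (dx, dy) steps

def deposit_one():
    x, y = int(rng.integers(W)), H - 1                  # release near the separator side
    while True:
        dx, dy = moves[rng.integers(4)]
        x = (x + dx) % W                                # periodic in x
        y = min(max(y + dy, 1), H - 1)                  # reflecting in y
        below = occupied[y - 1, x]
        above = occupied[y + 1, x] if y + 1 < H else False
        left, right = occupied[y, (x - 1) % W], occupied[y, (x + 1) % W]
        if below or above or left or right:             # stick next to the deposit
            occupied[y, x] = True
            return

for _ in range(800):
    deposit_one()

peaks = [occupied[:, j].nonzero()[0].max() for j in range(W)]
print(f"mean deposit height: {np.mean(peaks):.1f}   tallest dendrite: {max(peaks)} lattice sites")
```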

Relevance: 10.00%

Abstract:

This article outlines the outcome of work that set out to provide one of the specified integral contributions to the overarching objectives of the EU-sponsored LIFE98 project described in this volume. Among others, these included a requirement to marry automatic monitoring and dynamic modelling approaches in the interests of securing better management of water quality in lakes and reservoirs. The particular task given to us was to devise the elements of an active management strategy for the Queen Elizabeth II Reservoir. This is one of the larger reservoirs supplying the population of the London area: after purification and disinfection, its water goes directly to the distribution network and to the consumers. The quality of the water in the reservoir is of primary concern, for the greater the content of biogenic materials, including phytoplankton, the more prolonged the purification and the more expensive the treatment. Whatever good phytoplankton may do by way of oxygenation and oxidative purification, it is eventually relegated to an impurity that has to be removed from the final product. Indeed, it has been estimated that the cost of removing algae and microorganisms from water represents about one quarter of its price at the tap. In chemically fertile waters, such as those typifying the resources of the Thames Valley, there is thus a powerful and ongoing incentive to minimise plankton growth in storage reservoirs. Indeed, the Thames Water company and its predecessor undertakings have a long and impressive history of confronting and quantifying the fundamentals of phytoplankton growth in their reservoirs and of developing strategies for operation and design to combat them. The work described here follows in this tradition. However, the use of the model PROTECH-D to investigate present phytoplankton growth patterns in the Queen Elizabeth II Reservoir questioned the interpretation of some of the recent observations. On the other hand, it has reinforced the theories underpinning the original design of this and the Thames Valley storage reservoirs constructed subsequently. The authors recount these experiences as an example of how simulation models can hone the theoretical base and its application to the practical problems of supplying water of good quality at economic cost, before the engineering is initiated.

Relevance: 10.00%

Abstract:

Bio-orthogonal non-canonical amino acid tagging (BONCAT) is an analytical method that allows the selective analysis of the subset of newly synthesized cellular proteins produced in response to a biological stimulus. In BONCAT, cells are treated with the non-canonical amino acid L-azidohomoalanine (Aha), which is utilized in protein synthesis in place of methionine by the wild-type translational machinery. Nascent, Aha-labeled proteins are selectively ligated to affinity tags for enrichment and subsequently identified via mass spectrometry. The work presented in this thesis exhibits advancements in and applications of the BONCAT technology that establish it as an effective tool for analyzing proteome dynamics with time-resolved precision.

Chapter 1 introduces the BONCAT method and serves as an outline for the thesis as a whole. I discuss motivations behind the methodological advancements in Chapter 2 and the biological applications in Chapters 3 and 4.

Chapter 2 presents methodological developments that make BONCAT a proteomic tool capable of, in addition to identifying newly synthesized proteins, accurately quantifying rates of protein synthesis. I demonstrate that this quantitative BONCAT approach can measure proteome-wide patterns of protein synthesis at time scales inaccessible to alternative techniques.
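The sketch below illustrates the quantitative step in its simplest form: fit the labeled-protein signal against pulse-labeling time for each protein and read the slope as a relative synthesis rate. The protein names, pulse times, and intensities are hypothetical, and this linear fit is only a stand-in for the thesis's actual quantification scheme.

```python
import numpy as np

# Relative protein synthesis rates from time-course labeling data: for each
# protein, fit labeled signal vs. Aha pulse duration and take the slope.
# All names and numbers below are hypothetical illustrations.
pulse_min = np.array([5.0, 10.0, 20.0, 30.0])            # Aha pulse durations (minutes)
labeled_signal = {                                        # labeled / total MS signal
    "proteinA": np.array([0.02, 0.05, 0.09, 0.14]),
    "proteinB": np.array([0.01, 0.02, 0.03, 0.05]),
}

rates = {}
for name, y in labeled_signal.items():
    slope, intercept = np.polyfit(pulse_min, y, 1)        # linear fit: signal vs time
    rates[name] = slope                                   # relative synthesis rate (1/min)

for name, r in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{name}: relative synthesis rate ~ {r:.4f} per minute")
```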

In Chapter 3, I use BONCAT to study the biological function of the small RNA regulator CyaR in Escherichia coli. I correctly identify previously known CyaR targets, and validate several new CyaR targets, expanding the functional roles of the sRNA regulator.

In Chapter 4, I use BONCAT to measure the proteomic profile of the quorum sensing bacterium Vibrio harveyi during the time-dependent transition from individual- to group-behaviors. My analysis reveals new quorum-sensing-regulated proteins with diverse functions, including transcription factors, chemotaxis proteins, transport proteins, and proteins involved in iron homeostasis.

Overall, this work describes how to use BONCAT to perform quantitative, time-resolved proteomic analysis and demonstrates that these measurements can be used to study a broad range of biological processes.

Relevance: 10.00%

Abstract:

Measurements and modeling of Cu2Se, Ag2Se, and Cu2S show that superionic conductors have great potential as thermoelectric materials. Cu2Se and Ag2Se are predicted to reach a zT of 1.2 at room temperature if their carrier concentrations can be reduced, and Cu-vacancy doped Cu2S reaches a maximum zT of 1.7 at 1000 K. Te-doped Ag2Se achieves a zT of 1.2 at 520 K, and could reach a zT of 1.7 if its carrier concentration could be reduced. However, superionic conductors tend to have high carrier concentrations due to the presence of metal defects. The carrier concentration has been found to be difficult to reduce by altering the defect concentration, therefore materials that are underdoped relative to the optimum carrier concentration are easier to optimize. The results of Te-doping of Ag2Se show that reducing the carrier concentration is possible by reducing the maximum Fermi level in the material.

Two new methods for analyzing thermoelectric transport data were developed. The first involves scaling the temperature-dependent transport data according to the temperature dependences expected of a single parabolic band model and using all of the scaled data to perform a single parabolic band analysis, instead of being restricted to using one data point per sample at a fixed temperature. This allows for a more efficient use of the transport data. The second involves scaling only the Seebeck coefficient and electrical conductivity. This allows for an estimate of the quality factor (and therefore the maximum zT in the material) without using Hall effect data, which are not always available due to time and budget constraints and are difficult to obtain in high-resistivity materials. Methods for solving the coherent potential approximation effective medium equations were developed in conjunction with measurements of the resistivity tensor elements of composite materials. This allows the electrical conductivity and mobility of each phase in the composite to be determined from measurements of the bulk. This points out a new method for measuring the pure-phase electrical properties in impure materials, for measuring the electrical properties of unknown phases in composites, and for quantifying the effects of quantum interactions in composites.
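The sketch below shows the core of a single parabolic band calculation under the usual acoustic-phonon-scattering assumption: invert a measured Seebeck coefficient to a reduced chemical potential via Fermi-Dirac integrals, then evaluate the Lorenz number there. The "measured" value is made up, and this textbook SPB recipe is an illustration rather than the scaled-data procedure developed in the thesis.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Single-parabolic-band (SPB) relations, acoustic-phonon scattering assumed.
kB_over_e_uV = 86.17                       # k_B / e in microvolts per kelvin

def fermi_integral(j, eta):
    """F_j(eta) = integral_0^inf x**j / (1 + exp(x - eta)) dx."""
    upper = 50.0 + max(eta, 0.0)           # integrand is negligible beyond this
    return quad(lambda x: x**j / (1.0 + np.exp(x - eta)), 0.0, upper)[0]

def seebeck_uV_per_K(eta):
    """SPB Seebeck coefficient for acoustic-phonon scattering."""
    return kB_over_e_uV * (2.0 * fermi_integral(1, eta) / fermi_integral(0, eta) - eta)

def lorenz_number(eta):
    """SPB Lorenz number (W Ohm / K^2) for acoustic-phonon scattering."""
    f0, f1, f2 = (fermi_integral(j, eta) for j in (0, 1, 2))
    return (86.17e-6) ** 2 * (3.0 * f0 * f2 - 4.0 * f1 ** 2) / f0 ** 2

S_measured = 150.0                          # uV/K, hypothetical sample
eta = brentq(lambda e: seebeck_uV_per_K(e) - S_measured, -10.0, 20.0)
print(f"reduced chemical potential eta ~ {eta:.2f}")
print(f"Lorenz number ~ {lorenz_number(eta):.2e} W Ohm / K^2")
```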

Relevance: 10.00%

Abstract:

In this thesis we study the growth of a Li electrode-electrolyte interface in the presence of an elastic prestress. In particular, we focus our interest on Li-air batteries with a solid electrolyte, LIPON, which is a new type of secondary or rechargeable battery. Theoretical studies and experimental evidence show that during charging the replated lithium adds unevenly to the electrode surface. This phenomenon eventually leads to dendrite formation as the battery is charged and discharged numerous times. In order to suppress or alleviate this deleterious effect of dendrite growth, we put forth a study based on a linear stability analysis. Taking into account all the mechanisms of mass transport and interfacial kinetics, we model the evolution of the interface. We find that, in the absence of stress, the stability of a planar interface depends on interfacial diffusion properties and interfacial energy. Specifically, if Herring-Mullins capillarity-driven interfacial diffusion is accounted for, interfaces are unstable against all perturbations of wavenumber larger than a critical value. We find that the effect of an elastic prestress is always to stabilize planar interfacial growth by increasing the critical wavenumber for instability. A parametric study quantifies the extent of the prestress stabilization in a manner that can potentially be used in the design of Li-air batteries. Moreover, employing finite differences, we numerically solve the equation that describes the evolution of the surface profile and present visualizations of the surface evolution over time. Lastly, numerical simulations performed in a commercial finite element code validate the theoretical formulation of the interfacial elastic energy change with respect to the planar interface.
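The sketch below illustrates only the numerical machinery mentioned above: an explicit finite-difference integration of a Herring-Mullins fourth-order interfacial-diffusion term on a periodic 1-D profile. The thesis's full evolution equation additionally couples deposition kinetics and the elastic prestress, which set the instability and its critical wavenumber; those terms are omitted here and all coefficients are illustrative.

```python
import numpy as np

# Explicit finite differences for the capillarity-smoothing term alone:
# dh/dt = -B * d4h/dx4 on a periodic 1-D interface (illustrative coefficients).
B, dx, dt, steps = 1.0, 1.0, 0.1, 2000            # dt below the ~dx**4/(8B) stability limit
x = np.arange(256) * dx
h = (0.5 * np.sin(2 * np.pi * x / 32)
     + 0.05 * np.random.default_rng(0).standard_normal(x.size))

def fourth_derivative(h, dx):
    """Central-difference biharmonic stencil with periodic boundaries."""
    return (np.roll(h, 2) - 4 * np.roll(h, 1) + 6 * h
            - 4 * np.roll(h, -1) + np.roll(h, -2)) / dx**4

roughness0 = h.std()
for _ in range(steps):
    h = h - dt * B * fourth_derivative(h, dx)     # explicit Euler update

print(f"surface roughness: {roughness0:.3f} -> {h.std():.3f} "
      "(this isolated term smooths the profile)")
```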