894 results for Unified Model Reference
Abstract:
In this note, we briefly survey some recent approaches to the approximation of the Bayes factor used in Bayesian hypothesis testing and in Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.
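For a concrete feel for two of the estimators being surveyed, the sketch below compares plain importance sampling (prior as proposal) with the harmonic mean estimator on a conjugate normal toy model whose evidence is known in closed form. This is an illustrative reconstruction, not code from the note; the model, sample sizes and seed are assumptions.

    # Marginal-likelihood estimators on a toy model: y_i ~ N(mu, 1), mu ~ N(0, 1),
    # so the exact evidence is the density of y under N(0, I + 11^T).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    y = rng.normal(0.5, 1.0, size=20)              # synthetic data (assumed)
    n = len(y)

    exact = stats.multivariate_normal(mean=np.zeros(n),
                                      cov=np.eye(n) + np.ones((n, n))).pdf(y)

    def loglik(mu):
        # log p(y | mu) for an array of mu values
        return stats.norm.logpdf(y[:, None], mu, 1.0).sum(axis=0)

    # Importance sampling with the prior as proposal
    mu_prior = rng.normal(0.0, 1.0, 100000)
    is_est = np.exp(loglik(mu_prior)).mean()

    # Harmonic mean of the likelihood over posterior draws (known to be unstable);
    # the posterior here is N(n*ybar/(n+1), 1/(n+1))
    mu_post = rng.normal(n * y.mean() / (n + 1), np.sqrt(1.0 / (n + 1)), 100000)
    hm_est = 1.0 / np.mean(np.exp(-loglik(mu_post)))

    print(f"exact={exact:.3e}  IS={is_est:.3e}  HM={hm_est:.3e}")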
Abstract:
Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, owing to their geomorphological importance as the reference surface for gravitation-driven material flow and to their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on the decision-making process based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented in a 5-50 m grid and used at application scales of 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analysis, and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, global characterisation of DEM error is a gross generalisation of reality, because the areas within which the assumption of stationarity is not violated are small in extent. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged, because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
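As an illustration of the simulation-based error propagation the thesis describes, the sketch below propagates a DEM error model into slope values via Monte Carlo, generating correlated error fields by process convolution (white noise smoothed with a Gaussian kernel). The synthetic terrain, grid size, error standard deviation and correlation ranges are assumptions, not values from the thesis.

    # Monte Carlo error propagation for a DEM derivative (slope), comparing
    # spatially uncorrelated and correlated DEM error models.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(0)
    cell = 10.0                                           # grid resolution in metres (assumed)
    x, y = np.meshgrid(np.arange(200), np.arange(200))
    dem = 50 * np.sin(x / 40.0) + 30 * np.cos(y / 25.0)   # synthetic terrain

    def slope(z, cell):
        gy, gx = np.gradient(z, cell)
        return np.degrees(np.arctan(np.hypot(gx, gy)))

    def error_field(shape, sigma_z, corr_cells, rng):
        """Correlated error by process convolution; corr_cells=0 -> uncorrelated."""
        noise = rng.standard_normal(shape)
        if corr_cells > 0:
            noise = gaussian_filter(noise, corr_cells)
            noise /= noise.std()                          # unit variance after smoothing
        return sigma_z * noise

    for corr in (0, 5):                                   # correlation range in cells (assumed)
        slopes = np.stack([slope(dem + error_field(dem.shape, 2.0, corr, rng), cell)
                           for _ in range(100)])
        print(f"corr range {corr} cells: mean slope std = {slopes.std(axis=0).mean():.3f} deg")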
Abstract:
Objectives: 1. Estimate the population parameters required for a management model, including survival, density, age structure, growth, and age and size at maturity and at recruitment to the adult eel fishery, and estimate their variability among individuals in a range of habitats. 2. Develop a management population dynamics model and use it to investigate management options. 3. Establish baseline data and sustainability indicators for long-term monitoring. 4. Assess the applicability of the above techniques to other eel fisheries in Australia, in collaboration with NSW, and distribute the developed tools via the Australia and New Zealand Eel Reference Group.
A Legendre spectral element model for sloshing and acoustic analysis in nearly incompressible fluids
Abstract:
A new spectral finite element formulation is presented for modelling sloshing and acoustic waves in nearly incompressible fluids. The formulation uses Legendre polynomials to derive the finite element interpolation shape functions in the Lagrangian frame of reference. The formulated element uses the Gauss-Lobatto-Legendre quadrature scheme for integrating the volumetric stiffness and mass matrices, while the conventional Gauss-Legendre quadrature scheme is used on the rotational stiffness matrix to completely eliminate the zero-energy modes that are normally associated with the Lagrangian FE formulation. The numerical performance of the spectral element formulated here is examined by performing the inf-sup test on a standard rectangular rigid tank partially filled with liquid. The eigenvalues obtained from the formulated spectral element are compared with those from the conventional equally spaced node locations of the h-type Lagrangian finite element, and the predicted results show that these spectral elements are more accurate and give superior convergence. The efficiency and robustness of the formulated elements are demonstrated by solving a few standard problems involving free vibration and dynamic response analysis with undistorted and distorted spectral elements, and the obtained results are compared with available results in the published literature. (C) 2009 Elsevier Inc. All rights reserved.
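The Gauss-Lobatto-Legendre rule mentioned above is standard and easy to reproduce; a minimal sketch follows (not the paper's element code). The nodes are the interval endpoints plus the roots of P'_N, with weights 2/(N(N+1) P_N(x_i)^2); the exactness check at the end uses an assumed test integrand.

    # Gauss-Lobatto-Legendre nodes and weights on [-1, 1]
    import numpy as np
    from numpy.polynomial import legendre as L

    def gll_nodes_weights(N):
        """Return the N+1 GLL nodes and weights (N >= 1)."""
        cN = np.zeros(N + 1); cN[N] = 1.0            # P_N in the Legendre basis
        interior = L.legroots(L.legder(cN))          # roots of P_N'
        nodes = np.concatenate(([-1.0], np.sort(interior), [1.0]))
        weights = 2.0 / (N * (N + 1) * L.legval(nodes, cN) ** 2)
        return nodes, weights

    nodes, weights = gll_nodes_weights(6)
    # GLL with N+1 points integrates polynomials up to degree 2N-1 exactly
    print(np.dot(weights, nodes ** 10))              # ~ 0.181818...
    print(2.0 / 11.0)                                # exact value of int_{-1}^{1} x^10 dx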
Abstract:
Numerical models used for atmospheric research, weather prediction and climate simulation describe the state of the atmosphere over the heterogeneous surface of the Earth. Several fundamental properties of atmospheric models depend on orography, i.e. on the average elevation of land over a model area. The higher the model's resolution, the more directly the details of orography influence the simulated atmospheric processes. This sets new requirements for the accuracy of the model formulations with respect to the spatially varying orography. Orography is always averaged, representing the surface elevation within the horizontal resolution of the model. In order to remove the smallest scales and steepest slopes, the continuous spectrum of orography is normally filtered (truncated) even further, typically beyond a few gridlengths of the model. This means that in numerical weather prediction (NWP) models there will always be subgrid-scale orography effects, which cannot be explicitly resolved by numerical integration of the basic equations but require parametrization. At the subgrid scale, different physical processes contribute at different scales. The parametrized processes interact with the resolved-scale processes and with each other. This study contributes to the building of a consistent, scale-dependent system of orography-related parametrizations for the High Resolution Limited Area Model (HIRLAM). The system comprises schemes for handling mesoscale (MSO) and small-scale (SSO) orographic effects on the simulated flow and a scheme for orographic effects on the surface-level radiation fluxes. Representation of orography, scale-dependencies of the simulated processes and interactions between the parametrized and resolved processes are discussed. From high-resolution digital elevation data, orographic parameters are derived for both momentum and radiation flux parametrizations. Tools for diagnostics and validation are developed and presented. The parametrization schemes applied, developed and validated in this study are currently being implemented into the reference version of HIRLAM.
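To illustrate the spectral filtering (truncation) of orography described above, here is a minimal sketch that removes wavelengths shorter than a few gridlengths from a synthetic elevation field using a 2-D Fourier low-pass. The grid spacing, cutoff and synthetic terrain are assumptions; HIRLAM's actual filter is not reproduced here.

    # Spectral truncation of orography beyond a chosen number of gridlengths
    import numpy as np

    def filter_orography(elev, dx, cutoff_gridlengths=4.0):
        """Zero all spectral components with wavelength < cutoff_gridlengths*dx."""
        nz, nx = elev.shape
        kx = np.fft.fftfreq(nx, d=dx)            # cycles per metre
        ky = np.fft.fftfreq(nz, d=dx)
        k = np.hypot(*np.meshgrid(kx, ky))       # wavenumber magnitude
        k_cut = 1.0 / (cutoff_gridlengths * dx)  # cutoff wavenumber
        spec = np.fft.fft2(elev)
        spec[k > k_cut] = 0.0
        return np.fft.ifft2(spec).real

    rng = np.random.default_rng(1)
    raw = rng.standard_normal((128, 128)).cumsum(0).cumsum(1)  # rough synthetic terrain
    smooth = filter_orography(raw, dx=2500.0)                  # 2.5 km grid (assumed)
    print(raw.std(), smooth.std())               # the filtered field is smoother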
Abstract:
The problem of constructing space-time (ST) block codes over a fixed, desired signal constellation is considered. In this situation, there is a tradeoff between the transmission rate as measured in constellation symbols per channel use and the transmit diversity gain achieved by the code. The transmit diversity is a measure of the rate of polynomial decay of the pairwise error probability of the code with increasing signal-to-noise ratio (SNR). In the setting of a quasi-static channel model, let n_t denote the number of transmit antennas and T the block interval. For any n_t <= T, a unified construction of (n_t x T) ST codes is provided here, for a class of signal constellations that includes the familiar pulse-amplitude (PAM), quadrature-amplitude (QAM), and 2^K-ary phase-shift-keying (PSK) modulations as special cases. The construction is optimal as measured by the rate-diversity tradeoff and can achieve any given integer point on the rate-diversity tradeoff curve. An estimate of the coding gain realized is given. Other results presented here include i) an extension of the optimal unified construction to the multiple-fading-block case, ii) a version of the optimal unified construction in which the underlying binary block codes are replaced by trellis codes, iii) a linear dispersion form for the underlying binary block codes, iv) a Gray-mapped version of the unified construction, and v) a generalization of the construction to the S-ary case, corresponding to constellations of size S^K. Items ii) and iii) are aimed at simplifying the decoding of this class of ST codes.
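The transmit diversity in question is usually computed via the rank criterion: the diversity order equals the minimum rank of the difference matrix over all pairs of distinct codewords. The sketch below checks this for the classical 2x2 Alamouti code with BPSK, which is an illustrative stand-in, not the paper's unified construction.

    # Rank criterion for transmit diversity, illustrated on the Alamouti code
    import itertools
    import numpy as np

    def alamouti(s1, s2):
        # rows = transmit antennas, columns = channel uses
        return np.array([[s1, -np.conj(s2)],
                         [s2,  np.conj(s1)]])

    bpsk = (1.0, -1.0)
    codebook = [alamouti(s1, s2) for s1 in bpsk for s2 in bpsk]

    min_rank = min(np.linalg.matrix_rank(C1 - C2)
                   for C1, C2 in itertools.combinations(codebook, 2))
    print("transmit diversity order:", min_rank)   # 2 = full diversity for n_t = 2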
Abstract:
An approach, starting with the bubble formation model of Khurana and Kumar, is presented and is found to be reasonably applicable to the formation of both bubbles and drops from single submerged nozzles. The model treats both phenomena jointly as the formation of a dispersed-phase entity resulting from injection, whose size depends upon the operating parameters and physical properties.
Abstract:
The ProFacil model is a generic process model defined as a framework model showing the links between the facilities management process and the building end user’s business process. The purpose of using the model is to support more detailed process modelling. The model has been developed using the IDEF0 modelling method. The ProFacil model describes business activities from a generalized point of view as management, support, and core processes and their relations. The model defines basic activities in the provision of a facility. Examples of these activities are “operate facilities”, “provide new facilities”, “provide re-build facilities”, “provide maintained facilities” and “perform dispose of facilities”. These are all generic activities providing a basis for further specialisation of company-specific FM activities and their tasks. A facilitator can establish a specialized process model using the ProFacil model, interacting with company experts to describe their company’s specific processes. Such modelling seminars or interviews are conducted informally, supported by the high-level process model as a common reference.
Abstract:
The open development model of software production has been characterized as the future model of knowledge production and distributed work. The open development model refers to publicly available source code ensured by an open source license, and to the extensive and varied distributed participation of volunteers enabled by the Internet. Contemporary spokesmen of open source communities and academics view open source development as a new form of volunteer work activity characterized by a 'hacker ethic' and 'bazaar governance'. The development of the Linux operating system is perhaps the best known example of such an open source project. It started as an effort by a user-developer and grew quickly into a large project with hundreds of user-developers as contributors. However, in 'hybrids', in which firms participate in open source projects oriented towards end-users, it seems that most users do not write code. In this study, the OpenOffice.org project, initiated by Sun Microsystems, represents such a project. In addition, Finnish public sector ICT decision-making concerning open source use is studied. The purpose is to explore the assumptions, theories and myths related to the open development model by analysing the discursive construction of the OpenOffice.org community: its developers, users and management. The qualitative study aims at shedding light on the dynamics and challenges of community construction and maintenance, and the related power relations in hybrid open source, by asking two main research questions: How are the structure and membership constellation of the community, specifically the relation between developers and users, linguistically constructed in hybrid open development? What characterizes Internet-mediated virtual communities, how can they be defined, and how do they differ from hierarchical forms of knowledge production on the one hand and from traditional volunteer communities on the other? The study utilizes sociological, psychological and anthropological concepts of community for understanding the connection between the real and the imaginary in so-called virtual open source communities. Intermediary methodological and analytical concepts are borrowed from discourse and rhetorical theories. A discursive-rhetorical approach is offered as a methodological toolkit for studying texts and writing in Internet communities. The empirical chapters approach the problem of community and its membership from four complementary points of view. The data comprise mailing list discussions, personal interviews, web page writings, email exchanges, field notes and other historical documents. The four viewpoints are: 1) the community as conceived by volunteers, 2) the individual contributor's attachment to the project, 3) public sector organizations as users of open source, and 4) the community as articulated by the community manager. I arrive at four conclusions concerning my empirical studies (1-4) and two general conclusions (5-6). 1) Sun Microsystems and the OpenOffice.org Groupware volunteers failed to develop the necessary and sufficient open code and open dialogue to ensure collaboration, thus splitting the Groupware community into the volunteers ('we') and the firm ('them'). 2) Instead of separating intrinsic and extrinsic motivations, I find that volunteers' unique patterns of motivation are tied to changing objects and personal histories prior to and during participation in the OpenOffice.org Lingucomponent project.
Rather than seeing volunteers as a unified community, they can be better understood as independent entrepreneurs in search of a 'collaborative community'. The boundaries between work and hobby are blurred and shifting, thus questioning the usefulness of the concept of 'volunteer'. 3) The public sector ICT discourse portrays a dilemma and tension between the freedom to choose, use and develop one's desktop in the spirit of open source on the one hand, and the striving for better desktop control and maintenance by IT staff and user advocates on the other. The link between the global OpenOffice.org community and local end-user practices is weak and mediated by the problematic relationship between IT staff and (end-)users. 4) 'Authoring the community' can be seen as a new type of managerial practice in hybrid open source communities. The ambiguous concept of community is a powerful strategic tool for orienting towards multiple real and imaginary audiences, as evidenced in the global membership rhetoric. 5) The changing and contradictory discourses of this study show a change in the conceptual system and the developer-user relationship of the open development model. This change is characterized as a movement from a hacker ethic and bazaar governance towards a more professionally and strategically regulated community. 6) The community is simultaneously real and imagined, and can be characterized as a 'runaway community'. Discursive action can be seen as a specific type of online open source engagement. Hierarchies and structures are created through discursive acts. Key words: Open Source Software, open development model, community, motivation, discourse, rhetoric, developer, user, end-user
Abstract:
Quantum cell models for delocalized electrons provide a unified approach to the large NLO responses of conjugated polymers and pi-pi* spectra of conjugated molecules. We discuss exact NLO coefficients of infinite chains with noninteracting pi-electrons and finite chains with molecular Coulomb interactions V(R) in order to compare exact and self-consistent-field results, to follow the evolution from molecular to polymeric responses, and to model vibronic contributions in third-harmonic-generation spectra. We relate polymer fluorescence to the alternation delta of transfer integrals t(1+/-delta) along the chain and discuss correlated excited states and energy thresholds of conjugated polymers.
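The alternation delta of the transfer integrals t(1+/-delta) can be made concrete with a minimal tight-binding (Hueckel) sketch for non-interacting pi-electrons: diagonalizing a chain with alternating hoppings shows the optical gap approaching 4*t*delta. The chain lengths, t and delta below are illustrative assumptions, not values from the paper.

    # HOMO-LUMO gap of a bond-alternated chain at half filling
    import numpy as np

    def huckel_gap(n_sites, t=1.0, delta=0.07):
        """Gap of a chain with alternating hoppings t(1+delta), t(1-delta)."""
        hops = [-t * (1 + delta if i % 2 == 0 else 1 - delta)
                for i in range(n_sites - 1)]
        H = np.diag(hops, 1) + np.diag(hops, -1)   # tridiagonal hopping matrix
        e = np.linalg.eigvalsh(H)                  # eigenvalues in ascending order
        return e[n_sites // 2] - e[n_sites // 2 - 1]

    for n in (10, 50, 200, 1000):
        print(n, huckel_gap(n))                    # approaches 4*t*delta = 0.28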
Abstract:
In this paper, analytical expressions for the optimal Vdd and Vth that minimize energy under a given speed constraint are derived. These expressions are based on the EKV model for transistors and are valid in both the strong inversion and subthreshold regions. The effect of gate leakage on the optimal Vdd and Vth is analyzed. A new gradient-based algorithm for controlling Vdd and Vth based on delay and power monitoring results is proposed. A Vdd-Vth controller which uses the algorithm to dynamically control the supply and threshold voltages of a representative logic block (the sum-of-absolute-differences computation of an MPEG decoder) is designed. Simulation results using 65 nm predictive technology models are given.
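As a rough illustration of a gradient-based Vdd-Vth controller of this kind, the sketch below descends a numerical energy gradient under a delay-violation penalty. It uses a toy alpha-power delay model and a crude dynamic-plus-subthreshold-leakage energy model in place of the paper's EKV-based expressions; every constant is an illustrative assumption.

    # Toy gradient-based Vdd/Vth tuning under a delay constraint
    import math

    ALPHA, K_D, C_EFF, I0, N_VT = 1.4, 1e-9, 1e-12, 1e-3, 0.05   # all assumed

    def delay(vdd, vth):
        return K_D * vdd / (vdd - vth) ** ALPHA                  # alpha-power law

    def energy(vdd, vth, t_cycle):
        dyn = C_EFF * vdd ** 2                                   # switching energy per cycle
        leak = I0 * math.exp(-vth / N_VT) * vdd * t_cycle        # subthreshold leakage
        return dyn + leak

    def cost(vdd, vth, t_target):
        viol = max(0.0, delay(vdd, vth) / t_target - 1.0)        # delay-violation penalty
        return energy(vdd, vth, t_target) / 1e-12 + 1e3 * viol ** 2

    def tune(t_target, vdd=1.2, vth=0.4, lr=1e-3, steps=4000, eps=1e-5):
        """Numerical-gradient descent; monitors play this role in the paper."""
        for _ in range(steps):
            gv = (cost(vdd + eps, vth, t_target) - cost(vdd - eps, vth, t_target)) / (2 * eps)
            gt = (cost(vdd, vth + eps, t_target) - cost(vdd, vth - eps, t_target)) / (2 * eps)
            vdd = min(1.3, max(0.5, vdd - lr * gv))              # clamp to a sane range
            vth = min(0.45, max(0.1, vth - lr * gt))
        return vdd, vth

    vdd, vth = tune(t_target=2e-9)
    print(f"Vdd = {vdd:.3f} V, Vth = {vth:.3f} V, delay = {delay(vdd, vth)*1e9:.2f} ns")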
Abstract:
SecB is a homotetrameric cytosolic chaperone that forms part of the protein translocation machinery in E. coli. Due to SecB, nascent polypeptides are maintained in an unfolded, translocation-competent state devoid of tertiary structure and are thus guided to the translocon. In vitro, SecB rapidly binds to a variety of ligands in a non-native state. We have previously investigated the bound-state conformation of the model substrate bovine pancreatic trypsin inhibitor (BPTI), as well as the conformation of SecB itself, by using proximity relationships based on site-directed spin labeling and pyrene fluorescence methods. It was shown that SecB undergoes a conformational change during the process of substrate binding. Here, we generated SecB mutants containing but a single cysteine per subunit, or an exposed, highly reactive new cysteine after removal of the nearby intrinsic cysteines. Quantitative spin labeling was achieved with the methanethiosulfonate spin label (MTS) at positions C97 or E90C, respectively. High-field (W-band) electron paramagnetic resonance (EPR) measurements revealed that with BPTI present the spin labels are exposed to a more polar/hydrophilic environment. Nanoscale distance measurements with double electron-electron resonance (DEER) were in excellent agreement with distances obtained by molecular modeling. Binding of BPTI also led to a slight change in distances between labels at C97 but not at E90C. While the shorter distance in the tetramer increased, the larger diagonal distance decreased. These findings can be explained by a widening of the tetrameric structure upon substrate binding, much like the opening of two pairs of scissors.
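For orientation, DEER distance measurements of this kind rest on the 1/r^3 scaling of the dipolar coupling between the two electron spins, with a coupling of roughly 52.04 MHz at 1 nm for g ~ 2. A minimal sketch follows; the example frequency is illustrative, not a value from the study.

    # Spin-spin distance from a DEER dipolar frequency
    NU_0 = 52.04  # MHz * nm^3, dipolar constant for two g ~ 2 electron spins

    def deer_distance(nu_dip_mhz):
        """Distance in nm from the dipolar frequency in MHz."""
        return (NU_0 / nu_dip_mhz) ** (1.0 / 3.0)

    print(deer_distance(1.5))   # ~3.3 nm, a typical DEER-accessible distance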
Abstract:
The fluctuating force model is developed and applied to the turbulent flow of a gas-particle suspension in a channel in the limit of high Stokes number, where the particle relaxation time is large compared to the fluid correlation time, and low particle Reynolds number, where the Stokes drag law can be used to describe the interaction between the particles and the fluid. In contrast to Couette flow, the fluid velocity variances in the different directions in the channel are highly non-homogeneous, and they exhibit significant variation across the channel. First, we analyse the fluctuating particle velocity and acceleration distributions at different locations across the channel. The distributions are found to be non-Gaussian near the centre of the channel, and they exhibit significant skewness and flatness. However, the acceleration distributions are closer to Gaussian at locations away from the channel centre, especially in regions where the variances of the fluid velocity fluctuations are at a maximum. The time correlations of the fluid velocity fluctuations and particle acceleration fluctuations are evaluated, and it is found that the time correlation of the particle acceleration fluctuations is close to the time correlation of the fluid velocity in a 'moving Eulerian' reference frame, moving with the mean fluid velocity. The variances of the fluctuating force distributions in the Langevin simulations are determined from the time correlations of the fluid velocity fluctuations, and the results are compared with direct numerical simulations. Quantitative agreement between the two simulations is obtained provided the particle viscous relaxation time is at least five times larger than the fluid integral time.
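A minimal version of such a fluctuating-force (Langevin) model can be sketched as an Ornstein-Uhlenbeck process: the particle velocity relaxes toward the mean fluid velocity on the viscous relaxation time and is driven by a Gaussian force whose variance is set by the fluid velocity fluctuations. All parameters below are illustrative, not the paper's.

    # Langevin model: dv = -(v - u_mean)/tau_v dt + sqrt(2 sigma_u^2 / tau_v) dW
    import numpy as np

    def langevin_velocities(n_particles=10000, n_steps=5000, dt=1e-3,
                            tau_v=0.5, u_mean=1.0, sigma_u=0.2, seed=0):
        rng = np.random.default_rng(seed)
        v = np.full(n_particles, u_mean)
        amp = np.sqrt(2.0 * sigma_u ** 2 / tau_v * dt)   # fluctuating-force amplitude
        for _ in range(n_steps):
            v += -(v - u_mean) / tau_v * dt + amp * rng.standard_normal(n_particles)
        return v

    v = langevin_velocities()
    # Stationary velocity variance should approach sigma_u^2 = 0.04
    print(v.mean(), v.var())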
Abstract:
Spectral efficiency is a key characteristic of cellular communications systems, as it quantifies how well the scarce spectrum resource is utilized. It is influenced by the scheduling algorithm as well as the signal and interference statistics, which, in turn, depend on the propagation characteristics. In this paper we derive analytical expressions for the short-term and long-term channel-averaged spectral efficiencies of the round robin, greedy Max-SINR, and proportional fair schedulers, which are popular and cover a wide range of system performance and fairness trade-offs. A unified spectral efficiency analysis is developed to highlight the differences among these schedulers. The analysis is different from previous work in the literature in the following aspects: (i) it does not assume the co-channel interferers to be identically distributed, as is typical in realistic cellular layouts, (ii) it avoids the loose spectral efficiency bounds used in the literature, which only considered the worst case and best case locations of identical co-channel interferers, (iii) it explicitly includes the effect of multi-tier interferers in the cellular layout and uses a more accurate model for handling the total co-channel interference, and (iv) it captures the impact of using small modulation constellation sizes, which are typical of cellular standards. The analytical results are verified using extensive Monte Carlo simulations.
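The qualitative behaviour of the three schedulers is easy to reproduce by Monte Carlo: the sketch below serves users cyclically (round robin), by best instantaneous rate (Max-SINR), or by best rate-to-average ratio (proportional fair), over non-identically distributed Rayleigh-faded users, with a rate cap standing in for a finite constellation. The fading model, mean SNRs and cap are assumptions, unlike the paper's analytical treatment.

    # Monte Carlo comparison of round robin, Max-SINR and proportional fair
    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_slots = 8, 20000
    mean_snr = rng.uniform(1.0, 10.0, n_users)          # non-identical users (assumed)

    avg = np.ones(n_users)                              # PF throughput averages
    thr = {"round robin": 0.0, "Max-SINR": 0.0, "proportional fair": 0.0}
    for t in range(n_slots):
        snr = mean_snr * rng.exponential(1.0, n_users)  # Rayleigh fading power
        rate = np.minimum(np.log2(1.0 + snr), 6.0)      # cap ~ 64-QAM (assumed)
        thr["round robin"] += rate[t % n_users]
        thr["Max-SINR"] += rate.max()
        k = int(np.argmax(rate / avg))                  # PF: best rate/average ratio
        thr["proportional fair"] += rate[k]
        avg *= 0.999                                    # exponential forgetting ...
        avg[k] += 0.001 * rate[k]                       # ... crediting the served user
    for name, s in thr.items():
        print(f"{name}: {s / n_slots:.2f} bit/s/Hz")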
Abstract:
Electrical failure of insulation is known to be an extremal random process wherein nominally identical pro-rated specimens of equipment insulation at constant stress fail at inordinately different times, even under laboratory test conditions. In order to estimate the life of power equipment, it is necessary to run long-duration ageing experiments under accelerated stresses and to acquire and analyze insulation-specific failure data. In the present work, Resin Impregnated Paper (RIP), a relatively new insulation system of choice used in transformer bushings, is taken as an example. The failure data have been processed using proven statistical methods, both graphical and analytical. The physical model governing insulation failure at constant accelerated stress has been assumed to be a temperature-dependent inverse power law model.
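A standard graphical treatment of such constant-stress failure data is Weibull median-rank regression: plot ln(-ln(1 - F_i)) against ln(t_i) and read the shape and scale parameters off the fitted line. The sketch below uses made-up failure times for illustration only, not data from the study.

    # Graphical Weibull fit by median-rank regression
    import numpy as np

    t = np.sort(np.array([310., 450., 520., 700., 880., 1100., 1350., 1800.]))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)              # Bernard's median-rank estimate

    x = np.log(t)
    y = np.log(-np.log(1.0 - F))           # linear in ln(t) for Weibull data
    beta, c = np.polyfit(x, y, 1)          # slope = Weibull shape parameter
    eta = np.exp(-c / beta)                # scale, from intercept = -beta*ln(eta)
    print(f"shape beta = {beta:.2f}, scale eta = {eta:.0f} h")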