829 results for Robust multidisciplinary
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time, when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes.
We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then we use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
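The lagged spreading and interpolation operators at the heart of this discretization can be illustrated with a minimal 1D sketch. This is not the paper's implementation; the helper names are hypothetical, but the smoothed cosine kernel is Peskin's classical discrete delta function:

```python
import numpy as np

def peskin_delta(r, h):
    """Peskin's smoothed cosine delta function, supported on 4 grid cells."""
    s = np.abs(r) / h
    return np.where(s < 2.0, (1.0 + np.cos(np.pi * s / 2.0)) / (4.0 * h), 0.0)

def spread(F, X, grid):
    """Spread Lagrangian forces F at marker positions X onto a uniform 1D grid."""
    h = grid[1] - grid[0]
    f = np.zeros_like(grid)
    for Fk, Xk in zip(F, X):
        f += Fk * peskin_delta(grid - Xk, h)
    return f

def interpolate(u, X, grid):
    """Interpolate the Eulerian grid field u back to the marker positions X."""
    h = grid[1] - grid[0]
    return np.array([np.sum(u * peskin_delta(grid - Xk, h)) * h for Xk in X])
```

Because both operators are built from the same delta function, spreading and interpolation are adjoint: the grid inner product of the spread force with a velocity field equals the marker inner product of the forces with the interpolated velocities.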
Abstract:
We present a variable time step, fully adaptive in space, hybrid method for the accurate simulation of incompressible two-phase flows in the presence of surface tension in two dimensions. The method is based on the hybrid level set/front-tracking approach proposed in [H. D. Ceniceros and A. M. Roma, J. Comput. Phys., 205, 391-400, 2005]. Geometric, interfacial quantities are computed from front-tracking via the immersed-boundary setting, while the signed distance (level set) function, which is evaluated fast and to machine precision, is used as a fluid indicator. The surface tension force is obtained by employing the mixed Eulerian/Lagrangian representation introduced in [S. Shin, S. I. Abdel-Khalik, V. Daru and D. Juric, J. Comput. Phys., 203, 493-516, 2005], whose success in greatly reducing parasitic currents has been demonstrated. The use of our accurate fluid indicator together with effective Lagrangian marker control enhances this parasitic current reduction by several orders of magnitude. To resolve sharp gradients and salient flow features accurately and efficiently, we employ dynamic, adaptive mesh refinements. This spatial adaptation is used in concert with a dynamic control of the distribution of the Lagrangian nodes along the fluid interface and a variable time step, linearly implicit time integration scheme. We present numerical examples designed to test the capabilities and performance of the proposed approach, as well as three applications: the long-time evolution of a fluid interface undergoing Rayleigh-Taylor instability, an example of bubble ascending dynamics, and a drop impacting on a free interface whose dynamics we compare with both existing numerical and experimental data.
Abstract:
We propose a likelihood ratio test (LRT) with Bartlett correction in order to identify Granger causality between sets of time series gene expression data. The performance of the proposed test is compared to a previously published bootstrap-based approach. The LRT is shown to be significantly faster and statistically powerful even under non-normal distributions. An R package named gGranger containing an implementation of both Granger causality identification tests is also provided.
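The uncorrected LR statistic behind such a test can be sketched in a few lines: fit nested autoregressions with and without lags of the candidate cause and compare residual sums of squares. This is a hypothetical Python sketch, not the gGranger interface, and it omits the Bartlett correction described in the abstract:

```python
import numpy as np
from scipy.stats import chi2

def granger_lr_test(x, y, p=1):
    """LR test of 'x Granger-causes y' with p lags (no Bartlett correction)."""
    T = len(y)
    Y = y[p:]
    # lagged regressor matrices: column i holds the (i+1)-th lag
    lags_y = np.column_stack([y[p - i - 1 : T - i - 1] for i in range(p)])
    lags_x = np.column_stack([x[p - i - 1 : T - i - 1] for i in range(p)])
    ones = np.ones((T - p, 1))
    Xr = np.hstack([ones, lags_y])           # restricted model: lags of y only
    Xf = np.hstack([ones, lags_y, lags_x])   # full model: lags of y and x

    def rss(A):
        beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
        return np.sum((Y - A @ beta) ** 2)

    stat = (T - p) * np.log(rss(Xr) / rss(Xf))
    return stat, chi2.sf(stat, df=p)
```

Under the null of no Granger causality the statistic is asymptotically chi-squared with p degrees of freedom; the Bartlett correction rescales it to improve this approximation in small samples.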
A robust Bayesian approach to null intercept measurement error model with application to dental data
Abstract:
Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always be correct. The skew-normal/independent distributions are a class of asymmetric thick-tailed distributions which includes the skew-normal distribution as a special case. In this paper, we explore the use of skew-normal/independent distributions as a robust alternative to the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal, skew-t, skew-slash, and skew-contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
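The skew-normal building block of this distribution class has the Azzalini density 2*phi(x)*Phi(a*x), where phi and Phi are the standard normal density and CDF and a is the skewness parameter; a = 0 recovers the symmetric normal. A minimal sketch, checked against SciPy's implementation:

```python
import numpy as np
from scipy.stats import norm, skewnorm

def skew_normal_pdf(x, a):
    """Azzalini skew-normal density: 2*phi(x)*Phi(a*x); a=0 recovers the normal."""
    return 2.0 * norm.pdf(x) * norm.cdf(a * x)
```

The thick-tailed members of the class (skew-t, skew-slash, skew-contaminated normal) arise by additionally mixing the scale over a positive random variable.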
THE EXTENT OF MULTIDISCIPLINARY AUTHORSHIP OF ARTICLES ON SCIENTOMETRICS AND BIBLIOMETRICS IN BRAZIL
Abstract:
Publications in scientometrics and bibliometrics with Brazilian authorship expanded exponentially in the 1990-2006 period, growing 13-fold in the Web of Science database and 19.5-fold in the Google Scholar database. This increase is well above that of the total Brazilian scientific production in the same period (5.6-fold in the Web of Science). Some characteristics of this rise are: 1) The total number of articles in the period was 197; of these, 78% were published in 57 Brazilian journals and 22% in 13 international journals. 2) The national and international articles averaged 4.3 and 5.9 citations/article, respectively; two journals stood out, the national Ciencia da Informacao (44 articles averaging 6.7 citations/article) and the international Scientometrics (32 articles averaging 6.2 citations/article). 3) The articles show an impressive participation of authors from areas other than information science; only one-fourth of the authors are affiliated with the information science field, the remainder being distributed among the humanities/business administration, biology/biomedicine, health, and hard sciences. The occurrence of adventitious authors at this level of multidisciplinarity is uncommon in science. However, the possible benefits of such patterns are not clear, in view of the fragmented intercommunication among the authors noticeable through the citations. The advantages of changing this trend, and of using other scientometric and bibliometric databases such as SciELO to avoid an almost exclusive reliance on the Web of Science database, are discussed.
Abstract:
Social and economic development is closely associated with technological innovation and a well-developed biotechnological industry. In the last few years, Brazil's scientific production has been steadily increasing; however, the number of patents is lagging behind, with technological and translational research requiring governmental incentive and reinforcement. The Cell and Molecular Therapy Center (NUCEL) was created to develop activities in the translational research field, addressing concrete problems found in the biomedical and veterinary areas and actively searching for solutions by employing a genetic engineering approach to generate cell lines over-expressing recombinant proteins to be transferred to local biotech companies, aiming at furthering the development of a national competence for local production of biopharmaceuticals of widespread use and of life-saving importance. To this end, mammalian cell engineering technologies were used to generate cell lines over-expressing several different recombinant proteins of biomedical and biotechnological interest, namely, recombinant human Amylin/IAPP for diabetes treatment, human FVIII and FIX clotting factors for hemophilia, human and bovine FSH for fertility and reproduction, and human bone repair proteins (BMPs). Expression of some of these proteins is also being sought with the baculovirus/insect cell system (BEVS) which, in many cases, is able to deliver high-yield production of recombinant proteins with biological activity comparable to that of mammalian systems, but in a much more cost-effective manner. Transfer of some of these recombinant products to local biotech companies has been pursued by taking advantage of the Sao Paulo State Foundation (FAPESP) and Federal Government (FINEP, CNPq) incentives for joint Research, Development and Innovation partnership projects.
Abstract:
Empirical evidence suggests that the real exchange rate is characterized by the presence of near-unit roots and additive outliers. Recent studies have found evidence in favor of PPP reversion by using the quasi-differencing (Elliott et al., 1996) unit root tests (ERS), which are more efficient against local alternatives but are still based on least squares estimation. Unit root tests based on the least squares method usually tend to bias inference towards stationarity when additive outliers are present. In this paper, we incorporate quasi-differencing into M-estimation to construct a unit root test that is robust not only against near-unit roots but also against non-Gaussian behavior provoked by additive outliers. We revisit the PPP hypothesis and find less evidence in favor of PPP reversion when non-Gaussian behavior in real exchange rates is taken into account.
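The quasi-differencing (GLS detrending) step of Elliott, Rothenberg and Stock can be sketched as follows; the M-estimation stage of the proposed test is not reproduced here, and the function name is hypothetical:

```python
import numpy as np

def quasi_difference(y, cbar=-7.0):
    """ERS quasi-differencing: keep y_1, then form y_t - rho_bar*y_{t-1},
    where rho_bar = 1 + cbar/T is a local-to-unity root.
    cbar = -7 is the standard choice for the constant-only (demeaned) case."""
    T = len(y)
    rho = 1.0 + cbar / T
    yd = np.empty(T)
    yd[0] = y[0]
    yd[1:] = y[1:] - rho * y[:-1]
    return yd, rho
```

The deterministic terms are quasi-differenced with the same rho and removed by regression; the paper's contribution is to replace the least squares steps with a robust M-estimation criterion that downweights additive outliers.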
Abstract:
In this work we propose applying the equilibrium notions from the recent literature on robust mechanism design with endogenous information acquisition to a risk-sharing problem between two agents. Through this example we are able to motivate the use of this equilibrium notion, as well as to discuss the effects of introducing an information-dependent participation constraint. The simplicity of the model allows us to characterize the possibility of implementing the Pareto-efficient allocation in terms of the cost of information acquisition. Moreover, we show that the precision of the information can have a negative effect on the implementation of the efficient allocation. Finally, two specific examples of situations to which this model applies are given.
Abstract:
This paper presents a poverty profile for Brazil, based on three different sources of household data for 1996. We use PPV consumption data to estimate poverty and indigence lines. “Contagem” data is used to allow for an unprecedented refinement of the country’s poverty map. Poverty measures and shares are also presented for a wide range of population subgroups, based on the PNAD 1996, with new adjustments for imputed rents and spatial differences in cost of living. Robustness of the profile is verified with respect to different poverty lines, spatial price deflators, and equivalence scales. Overall poverty incidence ranges from 23% with respect to an indigence line to 45% with respect to a more generous poverty line. More importantly, however, poverty is found to vary significantly across regions and city sizes, with rural areas, small and medium towns and the metropolitan peripheries of the North and Northeast regions being poorest.
Abstract:
The sharp rise in property prices in Brazil in recent years has started a debate about the possible existence of a speculative bubble. Given the recent credit crisis in the United States, it is reasonable to ask whether the current situation in Brazil can be compared to the American crisis. Considering quantitative and fundamental arguments, we examine the Brazilian real estate context and question its sustainability in the near future. First, we analyze the rent-to-price ratio and the level of housing affordability, and also use a real cost model to assess whether or not the market is in equilibrium. We then examine some fundamental factors that affect property prices (supply and demand, credit and regulation, cultural factors) to look for evidence justifying the increase in property prices. From these observations we attempt to reach a conclusion about the evolution of prices in the Brazilian real estate market. While the data suggest that property prices are overvalued relative to rents, there is evidence of legitimate demand for new housing from the emerging Brazilian middle class. A greater risk may lie in the credit market, in which the Brazilian consumer is highly leveraged. Nevertheless, we find no evidence suggesting more than a temporary stabilization or correction in property prices.
Abstract:
We study semiparametric two-step estimators which have the same structure as parametric doubly robust estimators in their second step. The key difference is that we do not impose any parametric restriction on the nuisance functions that are estimated in a first stage, but retain a fully nonparametric model instead. We call these estimators semiparametric doubly robust estimators (SDREs), and show that they possess superior theoretical and practical properties compared to generic semiparametric two-step estimators. In particular, our estimators have substantially smaller first-order bias, allow for a wider range of nonparametric first-stage estimates, rate-optimal choices of smoothing parameters and data-driven estimates thereof, and their stochastic behavior can be well-approximated by classical first-order asymptotics. SDREs exist for a wide range of parameters of interest, particularly in semiparametric missing data and causal inference models. We illustrate our method with a simulation exercise.
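The doubly robust second step shared by parametric and semiparametric versions of these estimators has a simple form; for a missing-data mean under missing-at-random it is the AIPW combination of a regression prediction and an inverse-propensity-weighted residual. A minimal sketch with hypothetical names, using oracle nuisance functions in place of the nonparametric first stage:

```python
import numpy as np

def aipw_mean(y, observed, propensity, regression):
    """Doubly robust (AIPW) estimate of E[Y] under missing-at-random.
    y: outcomes (arbitrary values where unobserved),
    observed: 0/1 missingness indicator,
    propensity: first-stage estimate of P(observed | X),
    regression: first-stage estimate of E[Y | X]."""
    return np.mean(regression + observed * (y - regression) / propensity)
```

The estimator is consistent if either nuisance function is correct, and its first-order bias is a product of the two first-stage errors, which is what lets fully nonparametric first stages be plugged in with well-behaved asymptotics.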