940 results for practical epistemology analysis
Abstract:
The use of the Design by Analysis (DBA) route is a modern trend in international pressure vessel and piping codes in mechanical engineering. However, to apply DBA to structures under variable mechanical and thermal loads, it is necessary to ensure that the plastic collapse modes, alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case), are precluded. The tool available to achieve this is shakedown theory. Unfortunately, practical numerical applications of shakedown theory result in very large nonlinear optimization problems with nonlinear constraints. Precise, robust and efficient algorithms and finite elements to solve this problem in finite dimension are a more recent achievement. However, to solve real problems at an industrial level, it is also necessary to consider more realistic material properties and to carry out 3D analyses. Limited kinematic hardening is a typical property of the usual steels and should be considered in realistic applications. In this paper, a new finite element with internal thermodynamical variables to model kinematic hardening materials is developed and tested. This element is a mixed ten-node tetrahedron and, through an appropriate change of variables, it is possible to embed it in the shakedown analysis software developed by Zouain and co-workers for elastic ideally-plastic materials, and then use it to perform 3D shakedown analyses in cases with limited kinematic hardening materials.
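The optimization problem this abstract refers to can be illustrated at toy scale with the static (Melan) formulation: maximize the load factor such that one time-independent, self-equilibrated residual stress keeps every vertex of the load domain inside the yield surface. The sketch below is a minimal linear-programming version with invented numbers (a three-bar truss with a one-dimensional self-stress space); the mixed-element discretisation and the actual algorithm of Zouain and co-workers are of course far richer.

```python
import numpy as np
from scipy.optimize import linprog

# Toy static (Melan) shakedown check: find the largest load factor `lam`
# such that some time-independent residual stress keeps every vertex of
# the load domain inside the yield surface. All numbers are invented.

sig_y = 1.0                                   # yield stress (normalised)
# Elastic bar stresses at the 4 vertices of a two-parameter load domain
SE = np.array([[ 0.6,  0.3, -0.2],
               [ 0.6, -0.4,  0.5],
               [-0.1,  0.3, -0.2],
               [-0.1, -0.4,  0.5]])
r = np.array([1.0, -np.sqrt(2.0), 1.0])       # self-stress basis vector

# Unknowns x = (lam, a), residual stress rho = a * r.  Linear constraints:
# +/-(lam * SE[k, i] + a * r[i]) <= sig_y for every vertex k and bar i.
A_ub, b_ub = [], []
for k in range(SE.shape[0]):
    for i in range(SE.shape[1]):
        A_ub.append([ SE[k, i],  r[i]]); b_ub.append(sig_y)
        A_ub.append([-SE[k, i], -r[i]]); b_ub.append(sig_y)

res = linprog(c=[-1.0, 0.0], A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (None, None)])
print(f"shakedown factor: {-res.fun:.3f}, residual amplitude: {res.x[1]:.3f}")
```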
Abstract:
The use of the Design by Analysis (DBA) concept is a trend in modern pressure vessel and piping calculations. The flexibility of DBA allows us to deal with unexpected configurations detected at in-service inspections. It is also important in life-extension calculations, where deviations from the hypotheses originally adopted in Design by Formula can occur. To apply DBA to structures under variable mechanical and thermal loads, it is necessary that alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case) be precluded. These are the two basic failure modes considered by the ASME and European standards in DBA. Shakedown theory is the tool available to achieve this goal, and to apply it only the range of the variable loads and the material properties are needed. Precise, robust and efficient algorithms to solve the very large nonlinear optimization problems generated in numerical applications of shakedown theory are a recent achievement; Zouain and co-workers developed one such algorithm for elastic ideally-plastic materials. However, real practical applications require more realistic material properties. This paper presents an enhancement of this algorithm to deal with limited kinematic hardening, a typical property of the usual steels. This is done using internal thermodynamic variables. A discrete algorithm is obtained using a plane-stress mixed finite element with an internal variable. An example, a beam clamped at one end under constant axial force and variable bending moment, is presented to show the importance of considering limited kinematic hardening in a shakedown analysis.
Abstract:
In the design and safety assessment of mechanical structures, the use of the Design by Analysis (DBA) route is a modern trend. However, to make it possible to apply DBA to structures under variable loads, two basic failure modes considered by the ASME and European standards must be precluded: alternate plasticity and incremental collapse (with instantaneous plastic collapse as a particular case). Shakedown theory is a tool that allows us to ensure that these kinds of failure are avoided. In practical applications, however, very large nonlinear optimization problems are generated, and only in recent years has it become possible to obtain algorithms sufficiently accurate, robust and efficient for dealing with this class of problems. In this paper, one of these shakedown algorithms, developed for elastic ideally-plastic structures, is enhanced to include limited kinematic hardening, a more realistic material behavior. This is done in the continuous model by using internal thermodynamic variables. A corresponding discrete model is obtained using an axisymmetric mixed finite element with an internal variable. A thick-walled sphere under variable thermal and pressure loads is used as an example to show the importance of considering limited kinematic hardening in shakedown calculations.
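For reference, the static shakedown condition underlying all three abstracts above can be written compactly. The two-surface extension for limited kinematic hardening in the second statement is a standard form in the literature; the notation here is assumed, not taken from these papers.

```latex
% Melan's theorem: shakedown occurs if there exist a load factor
% \lambda > 1 and a time-independent, self-equilibrated residual
% stress field \bar{\rho} such that, for every point x and every
% load history in the load domain,
f\!\left(\lambda\,\sigma^{E}(x,t) + \bar{\rho}(x)\right) \le 0 .
% Two-surface extension for limited kinematic hardening: add a
% time-independent backstress field \bar{\pi}, bounded by the gap
% between the ultimate and initial yield stresses,
f\!\left(\lambda\,\sigma^{E}(x,t) + \bar{\rho}(x) - \bar{\pi}(x)\right) \le 0,
\qquad \sigma_{\mathrm{eq}}\!\left(\bar{\pi}(x)\right) \le \sigma_u - \sigma_y .
```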
Abstract:
Purpose – The objective of this exploratory study is to investigate the “flow-through” or relationship between top-line measures of hotel operating performance (occupancy, average daily rate and revenue per available room) and bottom-line measures of profitability (gross operating profit and net operating income), before and during the recent Great Recession. Design/methodology/approach – This study uses data provided by PKF Hospitality Research for the period from 2007-2009. A total of 714 hotels were analyzed and various top-line and bottom-line profitability changes were computed using both absolute levels and percentages. Multiple regression analysis was used to examine the relationship between top-line and bottom-line measures, and to derive flow-through ratios. Findings – The results show that average daily rate (ADR) and occupancy are significantly and positively related to gross operating profit per available room (GOPPAR) and net operating income per available room (NOIPAR). The evidence indicates that ADR, rather than occupancy, appears to be the stronger predictor and better measure of RevPAR growth and bottom-line profitability. The correlations and explained variances are also higher than those reported in prior research. Flow-through ratios range between 1.83 and 1.91 for NOIPAR, and between 1.55 and 1.65 for GOPPAR, across all chain-scales. Research limitations/implications – Limitations of this study include the limited number of years in the study period, the limited number of hotels in a competitive set, and self-selection of hotels by the researchers. Practical implications – While ADR and occupancy work in combination to drive profitability, the authors' study shows that ADR is the stronger predictor of profitability. Hotel managers can use flow-through ratios to make financial forecasts, or use them as inputs in valuation models, to forecast future profitability. Originality/value – This paper extends prior research on the relationship between top-line measures and bottom-line profitability and serves to inform lodging owners, operators and asset managers about flow-through ratios, and how these ratios impact hotel profitability.
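The flow-through logic is easy to make concrete: regress changes in a bottom-line measure on changes in a top-line measure, and the slope is the flow-through ratio. A minimal sketch with synthetic data follows; the sample size echoes the study, but all values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic year-over-year changes for 714 hypothetical hotels
d_revpar = rng.normal(-15.0, 8.0, 714)             # change in RevPAR ($)
d_noipar = 1.9 * d_revpar + rng.normal(0, 5, 714)  # change in NOI per room

# Slope of a simple OLS fit = flow-through ratio
X = np.column_stack([np.ones_like(d_revpar), d_revpar])
beta, *_ = np.linalg.lstsq(X, d_noipar, rcond=None)
print(f"flow-through ratio (NOIPAR vs RevPAR): {beta[1]:.2f}")
```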
Abstract:
The highly dynamic nature of some sandy shores, with continuous morphological changes, requires the development of efficient and accurate methodological strategies for coastal hazard assessment and morphodynamic characterisation. During the past decades, the general methodological approach for the establishment of coastal monitoring programmes was based on photogrammetry or classical geodetic techniques. With the advent of new space-based and airborne geodetic techniques, new methodologies were introduced into coastal monitoring programmes. This paper describes the development of a monitoring prototype based on the global positioning system (GPS). The prototype has a GPS multi-antenna mounted on a fast surveying platform, a land vehicle appropriate for driving on sand (a four-wheel quad). The system was conceived to survey a network of shore profiles along sandy shore stretches (subaerial beach) extending for several kilometres, from which high-precision digital elevation models can be generated. An analysis of the accuracy and precision of several differential GPS kinematic methodologies is presented. The development of an adequate survey methodology is the first step in morphodynamic shore characterisation or in coastal hazard assessment. The sampling method and the computational interpolation procedures are important steps in producing reliable three-dimensional surface maps that are as true to reality as possible. The quality of several interpolation methods used to generate grids was tested in areas with data gaps. The results obtained allow us to conclude that, with the developed survey methodology, it is possible to survey sandy shore stretches, at spatial scales of kilometres, with a vertical accuracy better than 0.10 m in the final digital elevation models.
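The interpolation-quality test described can be reproduced in outline: hold out a subset of surveyed points to emulate data gaps, grid the rest with competing interpolators, and compare predictions against the withheld elevations. A sketch on a synthetic beach surface (all shapes and parameters are assumptions):

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(1)

# Synthetic "surveyed" points on a gently sloping beach with a berm
xy = rng.uniform(0, 500, (4000, 2))                      # metres
z = (-0.02 * xy[:, 0]
     + 0.5 * np.exp(-((xy[:, 0] - 150) / 40) ** 2)
     + rng.normal(0, 0.02, 4000))                        # elevation (m)

# Hold out 20% of the points to emulate data gaps
test = rng.random(4000) < 0.2
for method in ("nearest", "linear", "cubic"):
    zhat = griddata(xy[~test], z[~test], xy[test], method=method)
    rmse = np.sqrt(np.nanmean((zhat - z[test]) ** 2))
    print(f"{method:8s} vertical RMSE: {rmse:.3f} m")    # target: < 0.10 m
```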
Abstract:
In a world where students are increasingly digitally tethered to powerful, ‘always on’ mobile devices, new models of engagement and approaches to teaching and learning are required from educators. Serious Games (SG) have proven instructional potential, but there is still a lack of methodologies and tools not only for their design but also to support game analysis and assessment. This paper explores the use of SG to increase student engagement and retention. The development phase of the Circuit Warz game is presented to demonstrate how electronic engineering education can be radically reimagined to create immersive, highly engaging learning experiences that are problem-centered and pedagogically sound. The Learning Mechanics–Game Mechanics (LM-GM) framework for SG analysis is introduced and its practical use in an educational game design scenario is shown as a case study.
Abstract:
Donna Soto-Morettini is one of the top performance coaches in the industry and has worked as casting director and performance coach for the hit BBC reality casting shows, I'd Do Anything, Any Dream Will Do, and How Do You Solve a Problem Like Maria. She was the founding senior vocal coach at Paul McCartney's Liverpool Institute for Performing Arts. Based on her years of teaching experience in a multitude of styles, this unique book is a practical guide to exploring the singing voice and will help to enhance vocal confidence in a range of styles including Pop, Jazz, Blues, Rock, Country and Gospel. Both singers and voice teachers will benefit from the clear analysis of these styles and advice on how to improve performance. Popular Singing provides effective alternatives to traditional voice training methods and demonstrates how these methods can be used to create a flexible and unique sound. A free CD of voice demonstrations is also included.
Abstract:
The phase difference principle is widely applied nowadays in sonar systems used for sea-floor bathymetry: the apparent angle of a target point is obtained from the phase difference measured between two close receiving arrays. Here we study the influence of phase difference estimation errors caused by the physical structure of the backscattered signals. It is shown that, under conditions commonly encountered in practice, beyond the usually considered effects of additive external noise and baseline decorrelation, the processing may be affected by the shifting footprint effect: the two interferometer receivers get simultaneous echo contributions coming from slightly shifted parts of the seabed, which degrades the signal coherence and, hence, the phase difference measurement. This geometrical effect is described analytically and checked with numerical simulations, for both square- and sine-shaped signal envelopes. Its relative influence depends on the geometrical configuration and receiver spacing, and it may be dominant in practical cases associated with bathymetric sonars. Measurements close to nadir, which are known to be especially difficult with interferometric systems, are addressed in particular.
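The shifting-footprint mechanism can be checked numerically along the lines the abstract describes: sum the echoes of random scatterers seen through two slightly offset footprint windows and measure the residual coherence of the two channels. A deliberately simplified Monte Carlo sketch (the geometry and all numbers are assumptions, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(2)
k = 2 * np.pi / 0.01           # acoustic wavenumber (wavelength = 1 cm)
n_scat, n_trials = 200, 2000   # scatterers per footprint, Monte Carlo trials
shift = 0.1                    # footprint offset between the two receivers,
                               # in units of the footprint length

acc12 = acc1 = acc2 = 0.0
for _ in range(n_trials):
    x = rng.uniform(0.0, 1.0, n_scat)                # scatterer positions
    a = rng.normal(size=n_scat) + 1j * rng.normal(size=n_scat)
    # Square-envelope windows seen by the two receivers, offset by `shift`
    w1 = x < 1.0 - shift
    w2 = x > shift
    s1 = np.sum(a * w1 * np.exp(-1j * k * x))
    s2 = np.sum(a * w2 * np.exp(-1j * k * x))
    acc12 += s1 * np.conj(s2); acc1 += abs(s1) ** 2; acc2 += abs(s2) ** 2

coh = abs(acc12) / np.sqrt(acc1 * acc2)
print(f"channel coherence for a {shift:.0%} footprint shift: {coh:.3f}")
```

The coherence loss grows with the footprint offset, which is the degradation of the phase difference measurement the abstract attributes to this effect.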
Abstract:
This paper analyzes the inner relations between classical probability and statistical probability, subjective probability and objective probability, prior probability and posterior probability, and transition probability and utility probability. It further analyzes, from the perspective of mathematics, the goal, the method, and the practical economic purpose represented by these various probabilities, so as to understand in depth their connotations and their relation to economic decision making, thus paving the way for scientific prediction and decision making.
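The prior/posterior pair the paper discusses is captured by Bayes' rule, P(H|D) = P(D|H)P(H)/P(D). A tiny numeric example in the economic spirit of the paper, with all probabilities invented for illustration:

```python
# Prior belief that a recession occurs next year (assumed)
p_recession = 0.30
# Likelihood of observing a warning signal under each hypothesis (assumed)
p_signal_given_rec = 0.80
p_signal_given_norec = 0.20

# Posterior after observing the signal, via Bayes' rule
p_signal = (p_signal_given_rec * p_recession
            + p_signal_given_norec * (1 - p_recession))
posterior = p_signal_given_rec * p_recession / p_signal
print(f"P(recession | signal) = {posterior:.3f}")   # 0.24 / 0.38 = 0.632
```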
Abstract:
It has been proposed that long-term consumption of diets rich in non-digestible carbohydrates (NDCs), such as cereals, fruit and vegetables, might protect against several chronic diseases; however, it has been difficult to fully establish their impact on health in epidemiological studies. The wide-ranging properties of the different NDCs may dilute their impact when they are combined into one category for statistical comparisons in correlation or multivariate analyses. Several mechanisms have been suggested to explain the protective effects of NDCs, including increased stool bulk, dilution of carcinogens in the colonic lumen, reduced transit time, lowered pH, and bacterial fermentation to short chain fatty acids (SCFA) in the colon. However, it is very difficult to measure SCFA in humans in vivo with any accuracy, so epidemiological studies on the impact of SCFA are not feasible. Most studies use dietary fibre (DF) or non-starch polysaccharide (NSP) intake to estimate the levels, but not all fibres or NSP are equally fermentable. The first aim of this thesis was the development of equations to estimate the amount of fermentable carbohydrate (FC) that reaches the human colon and is fermented fully to SCFA by the colonic bacteria. Several studies were examined for evidence to determine the percentage of each type of NDC that should be included in the final model, based on how much of the NDC entered the colon intact and to what extent it was fermented to SCFA in vivo. The model equations are:

FC-DF or FC-NSP 1: 100% soluble + 10% insoluble + 100% NDOs + 5% TS
FC-DF or FC-NSP 2: 100% soluble + 50% insoluble + 100% NDOs + 5% TS
FC-DF or FC-NSP 3: 100% soluble + 10% insoluble + 100% NDOs + 10% TS
FC-DF or FC-NSP 4: 100% soluble + 50% insoluble + 100% NDOs + 10% TS

(DF: dietary fibre; NSP: non-starch polysaccharides; NDOs: non-digestible oligosaccharides; TS: total starch)

The second study of this thesis aimed to test all four predicted FC-DF and FC-NSP equations, estimating FC from dietary records against urinary biomarkers of colonic NDC fermentation. The main finding of a cross-sectional comparison of habitual diet with urinary excretion of SCFA products was a weak but significant correlation between the 24 h urinary excretion of SCFA and acetate and the estimated FC-DF 4 and FC-NSP 4 when considering all of the study participants (n = 122). Similar correlations were observed with the data for valid participants (n = 78). It was also observed that FC-DF and FC-NSP had stronger positive correlations with 24 h urinary acetate and SCFA than DF and NSP alone. Hence, it could be hypothesised that using the developed index to estimate FC in the diet from dietary records might predict SCFA production in the colon in vivo in humans. The next study in this thesis aimed to validate the FC equations using in vitro models of small intestinal digestion and human colonic fermentation. The main finding of these in vitro studies was a strong agreement between the amounts of SCFA produced after actual in vitro fermentation of single fibres and different mixtures of NDCs and those predicted by the estimated FC from the developed equation FC-DF 4. These results, demonstrating a strong relationship between SCFA production in vitro from a range of fermentations of single fibres and mixtures of NDCs and that predicted by the FC equation, support the use of the FC equation for estimating FC from dietary records. Therefore, we conclude that the newly developed prediction equations are a valid and practical tool for assessing SCFA production in in vitro fermentation.
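The four candidate equations translate directly into a small calculator. The sketch below encodes the weights exactly as listed in the thesis; the gram inputs in the usage example are invented:

```python
# Weights applied to (soluble fibre, insoluble fibre, NDOs, total starch)
FC_EQUATIONS = {
    "FC-1": (1.00, 0.10, 1.00, 0.05),
    "FC-2": (1.00, 0.50, 1.00, 0.05),
    "FC-3": (1.00, 0.10, 1.00, 0.10),
    "FC-4": (1.00, 0.50, 1.00, 0.10),  # best correlated with urinary SCFA
}

def fermentable_carbohydrate(soluble, insoluble, ndos, total_starch):
    """Estimated fermentable carbohydrate (g/day) under each equation."""
    return {name: w[0]*soluble + w[1]*insoluble + w[2]*ndos + w[3]*total_starch
            for name, w in FC_EQUATIONS.items()}

# Example diet record: 8 g soluble fibre, 12 g insoluble fibre,
# 4 g NDOs and 120 g total starch per day (illustrative values)
print(fermentable_carbohydrate(8, 12, 4, 120))
```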
Abstract:
This dissertation research points out major challenges with current Knowledge Organization (KO) systems, such as subject gateways or web directories: (1) the current systems use traditional knowledge organization schemes based on controlled vocabularies, which are not well suited to web resources, and (2) information is organized by professionals rather than by users, so it does not reflect intuitively and instantaneously expressed users’ current needs. In order to explore users’ needs, I examined social tags, which are user-generated, uncontrolled vocabulary. As investment in professionally developed subject gateways and web directories diminishes (support for both BUBL and Intute, examined in this study, is being discontinued), understanding the characteristics of social tagging becomes even more critical. Several researchers have discussed social tagging behavior and its usefulness for classification or retrieval; however, further research is needed to investigate social tagging qualitatively and quantitatively in order to verify its quality and benefit. This research particularly examined the indexing consistency of social tagging in comparison to professional indexing, to examine the quality and efficacy of tagging. The data analysis was divided into three phases: analysis of indexing consistency, analysis of tagging effectiveness, and analysis of tag attributes. Most indexing consistency studies have been conducted with a small number of professional indexers, and they tended to exclude users; furthermore, such studies have mainly focused on physical library collections. This dissertation research bridged these gaps by (1) extending the scope of resources to various web documents indexed by users and (2) employing the Information Retrieval (IR) Vector Space Model (VSM)-based indexing consistency method, since it is suitable for dealing with a large number of indexers. As a second phase, an analysis of tagging effectiveness in terms of tagging exhaustivity and tag specificity was conducted to ameliorate the drawbacks of consistency analysis based only on quantitative measures of vocabulary matching. Finally, to investigate tagging patterns and behaviors, a content analysis of tag attributes was conducted based on the FRBR model. The findings revealed greater consistency across all subjects among taggers than between the two groups of professionals. Examination of the exhaustivity and specificity of social tags provided insights into particular characteristics of tagging behavior and its variation across subjects. To further investigate the quality of tags, a Latent Semantic Analysis (LSA) was conducted to determine to what extent tags are conceptually related to professionals’ keywords, and it was found that tags of higher specificity tended to have a higher semantic relatedness to professionals’ keywords. This leads to the conclusion that a term’s power as a differentiator is related to its semantic relatedness to documents. The findings on tag attributes identified important bibliographic attributes of tags beyond describing the subjects or topics of a document, and showed that tags have essential attributes matching those defined in FRBR.
Furthermore, in terms of specific subject areas, the findings identified that taggers exhibited different tagging behaviors, with distinctive features and tendencies, on web documents characterizing heterogeneous digital media resources. These results lead to the conclusion that there should be an increased awareness of diverse user needs by subject in order to improve metadata in practical applications. This dissertation research is a first necessary step toward utilizing social tagging in digital information organization, by verifying the quality and efficacy of social tagging. It combined quantitative (statistical) and qualitative (content analysis using FRBR) approaches to the vocabulary analysis of tags, providing a more complete examination of their quality. Through the detailed analysis of tag properties undertaken in this dissertation, we have a clearer understanding of the extent to which social tagging can be used to replace (and in some cases to improve upon) professional indexing.
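The VSM-based consistency measure amounts to representing each indexer's term assignments as a vector and taking the cosine between them. A minimal sketch, with term weighting simplified to binary (one possible choice, not necessarily the dissertation's):

```python
import math
from collections import Counter

def cosine_consistency(terms_a, terms_b):
    """Indexing consistency of two indexers as the cosine of their
    binary term vectors over the union vocabulary."""
    va, vb = Counter(set(terms_a)), Counter(set(terms_b))
    dot = sum(va[t] * vb[t] for t in va)          # shared terms
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical tag sets for the same web document
tagger = ["python", "tutorial", "programming", "beginner"]
professional = ["programming", "computer science", "education"]
print(f"consistency: {cosine_consistency(tagger, professional):.2f}")
```

Unlike pairwise exact-match measures, the vector formulation scales naturally to the large numbers of indexers found in tagging data, which is the suitability the dissertation cites.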
Abstract:
As climate change continues to impact socio-ecological systems, tools that assist conservation managers to understand vulnerability and target adaptations are essential. Quantitative assessments of vulnerability are rare because available frameworks are complex and lack guidance for dealing with data limitations and integrating across scales and disciplines. This paper describes a semi-quantitative method for assessing vulnerability to climate change that integrates socio-ecological factors to address management objectives and support decision-making. The method applies a framework first adopted by the Intergovernmental Panel on Climate Change and uses a structured 10-step process. The scores for each framework element are normalized and multiplied to produce a vulnerability score, and the assessed components are then ranked from high to low vulnerability. Sensitivity analyses determine which indicators most influence the analysis and the resultant decision-making process, so data quality for these indicators can be reviewed to increase robustness. Prioritisation of components for conservation weighs other economic, social and cultural values alongside the vulnerability rankings to target actions that reduce vulnerability to climate change by decreasing exposure or sensitivity and/or increasing adaptive capacity. This framework provides practical decision support and has been applied to marine ecosystems and fisheries, with two case applications provided as examples: (1) food security in Pacific Island nations under climate-driven fish declines, and (2) fisheries in the Gulf of Carpentaria, northern Australia. The step-wise process outlined here is broadly applicable and can be undertaken with minimal resources using existing data, thereby having great potential to inform adaptive natural resource management in diverse locations.
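The scoring step lends itself to a few lines of code: normalize each framework element, combine multiplicatively, rank. The combination rule below (vulnerability rising with exposure and sensitivity, falling with adaptive capacity) is a common convention and an assumption here, since the abstract only says the normalized scores are multiplied; all input numbers are invented.

```python
def normalize(scores):
    """Min-max normalize a dict of raw scores to [0, 1]."""
    lo, hi = min(scores.values()), max(scores.values())
    return {k: (v - lo) / (hi - lo) if hi > lo else 0.5
            for k, v in scores.items()}

# Raw 1-10 expert scores per assessed component (illustrative values)
exposure    = {"reef fishery": 8, "mud crab": 5, "prawn trawl": 6}
sensitivity = {"reef fishery": 7, "mud crab": 4, "prawn trawl": 8}
adaptive    = {"reef fishery": 3, "mud crab": 7, "prawn trawl": 5}

E, S, AC = normalize(exposure), normalize(sensitivity), normalize(adaptive)
vulnerability = {k: E[k] * S[k] * (1 - AC[k]) for k in E}
for comp, v in sorted(vulnerability.items(), key=lambda kv: -kv[1]):
    print(f"{comp:12s} vulnerability = {v:.2f}")   # ranked high to low
```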
Abstract:
The accurate prediction of stress histories for fatigue analysis is of utmost importance in the design process of wind turbine rotor blades. As detailed, transient, and geometrically non-linear three-dimensional finite element analyses are computationally far too expensive, it is commonly regarded as sufficient to calculate the stresses with a geometrically linear analysis and superimpose different stress states in order to obtain the complete stress histories. In order to quantify the error of geometrically linear simulations in the calculation of stress histories, and to verify the practical applicability of the superposition principle in fatigue analyses, this paper studies the influence of geometric non-linearity using the example of a trailing edge bond line, as this subcomponent suffers from high strains in the span-wise direction. The blade under consideration is that of the IWES IWT-7.5-164 reference wind turbine. From turbine simulations, the highest edgewise loading scenario among the fatigue load cases is used as the reference. A 3D finite element model of the blade is created and the bond line fatigue assessment is performed according to the GL certification guidelines in their 2010 edition, and in comparison to the latest DNV GL standard from the end of 2015. The results show a significant difference between the geometrically linear and non-linear stress analyses when the bending moments are approximated via a corresponding external loading, especially in the case of the 2010 GL certification guidelines. This finding emphasizes the need to reconsider the application of the superposition principle in fatigue analyses of modern flexible rotor blades, where geometric non-linearities become significant. In addition, a new load application methodology is introduced that reduces the geometrically non-linear behaviour of the blade in the finite element analysis.
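The superposition practice under scrutiny reduces, in the geometrically linear setting, to scaling unit-load stress states by the load time series and summing, which is exactly the step a flexible blade violates. A minimal sketch of that step, with invented unit stresses and a synthetic load history:

```python
import numpy as np

# Unit-load stress states at one bond-line hotspot (MPa per unit load),
# e.g. from flapwise, edgewise and torsional unit moments (assumed values)
sigma_unit = np.array([12.0, 35.0, 4.0])

# Synthetic load time series standing in for the turbine simulation output,
# shape (n_steps, 3): one column per load component
rng = np.random.default_rng(3)
loads = rng.normal(0.0, 1.0, (10_000, 3)).cumsum(axis=0) * 0.01

# Geometrically linear superposition of the stress history:
# sigma(t) = sum_j sigma_unit[j] * load_j(t)
sigma_t = loads @ sigma_unit
print(f"stress range at hotspot: {sigma_t.max() - sigma_t.min():.1f} MPa")
```

The paper's point is that for modern flexible blades the mapping from loads to stresses is no longer linear, so this cheap reconstruction can misestimate the ranges feeding the fatigue assessment.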
Abstract:
Developments in theory and experiment have raised the prospect of an electronic technology based on the discrete nature of electron tunnelling through a potential barrier. This thesis deals with novel design and analysis tools developed to study such systems. Possible devices include those constructed from ultrasmall normal tunnelling junctions. These exhibit charging effects, including the Coulomb blockade and correlated electron tunnelling. They allow transistor-like control of the transfer of single carriers, and present the prospect of digital systems operating at the information-theoretic limit. As such, they are often referred to as single electronic devices. Single electronic devices exhibit self-quantising logic and good structural tolerance. Their speed, immunity to thermal noise, and operating voltage all scale beneficially with junction capacitance. For ultrasmall junctions, the possibility of room-temperature operation at sub-picosecond timescales seems feasible. However, they are sensitive to external charge, whether from trapping-detrapping events, externally gated potentials, or system cross-talk. Quantum effects such as macroscopic quantum tunnelling of charge may degrade performance. Finally, any practical system will be complex and spatially extended (amplifying the above problems), and prone to fabrication imperfection. This summarises why new design and analysis tools are required. Simulation tools are developed, concentrating on the basic building blocks of single electronic systems: the tunnelling junction array and the gated turnstile device. Three main points are considered: the best method of estimating capacitance values from the physical system geometry; the mathematical model which should represent electron tunnelling based on these data; and the application of this model to the investigation of single electronic systems. (DXN004909)
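The "mathematical model which should represent electron tunnelling" is, in the standard orthodox theory of single-electron devices, a golden-rule tunnelling rate driven by the electrostatic energy change of the event. A sketch evaluating it for a single junction, showing Coulomb-blockade suppression at low temperature (the junction parameters are assumed):

```python
import numpy as np

E_CHARGE = 1.602e-19   # elementary charge (C)
K_B = 1.381e-23        # Boltzmann constant (J/K)

def orthodox_rate(dE, R_t, T):
    """Orthodox-theory tunnelling rate (1/s) for an event that changes
    the system's electrostatic energy by dE (J), through a junction of
    tunnel resistance R_t (ohm) at temperature T (K)."""
    x = dE / (K_B * T)
    # Gamma = (-dE / e^2 R_t) / (1 - exp(dE / kT)); events that lower
    # the energy (dE < 0) proceed, uphill events are suppressed.
    with np.errstate(over="ignore"):
        return (-dE / (E_CHARGE**2 * R_t)) / (1.0 - np.exp(x))

C, R_t = 1e-18, 100e3                     # 1 aF junction, 100 kOhm (assumed)
E_c = E_CHARGE**2 / (2 * C)               # charging energy, ~80 meV
print(orthodox_rate(+E_c, R_t, T=4.2))    # blockaded: rate ~ 0
print(orthodox_rate(-E_c, R_t, T=4.2))    # allowed: ~5e12 events/s
```

Monte Carlo simulators for junction arrays and turnstiles of the kind the thesis describes are typically built by sampling such rates for every candidate tunnelling event.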
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Ciências Humanas, Departamento de Filosofia, Programa de Pós-Graduação em Filosofia, 2015.