Abstract:
This paper aims to put into perspective the recent, post-9/11 debate on the United States' alleged exceptionalism and its impact on the definition of American foreign policy. It reminds readers that the United States was born of a similar debate, at a time when a crucial choice about its future had to be made. Indeed, the Founding Fathers discarded the revolutionary idea that America was altogether different from other (European) nations and, as such, could succeed in preserving republicanism and concentrating on domestic affairs. As Gordon Wood and Harvey Mansfield have shown, the 1787 version of republicanism was a departure from its earlier version, and such a change was necessary to the creation of a full-fledged federation, thus paving the way for the current powerful Federal Republic. The early failure of the exceptionalist creed did not cause its disappearance, as the contemporary form of exceptionalism demonstrates, but it created conditions that made an enduring and powerful influence very difficult.
Abstract:
We propose a low-complexity technique to generate amplitude-correlated time series with a Nakagami-m distribution and phase-correlated Gaussian-distributed time series, which is useful for simulating ionospheric scintillation effects during the transmission of GNSS signals. The method requires only knowledge of the parameters S4 (scintillation index) and σΦ (phase standard deviation), besides the definition of models for the amplitude and phase power spectra. The Zhang algorithm is used to produce Nakagami-distributed signals from a set of Gaussian autoregressive processes.
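The construction described in this abstract can be sketched in a few lines. The sketch below assumes the standard sum-of-squared-Gaussians route to a Nakagami-m amplitude (exact for integer and half-integer m, with m = 1/S4²), using AR(1) processes to supply the temporal correlation; the function name, the correlation coefficient rho, and the flat AR(1) spectrum are illustrative assumptions, not the paper's actual spectral models.

```python
import numpy as np

def scintillation_series(s4, sigma_phi, n=10000, rho=0.9, omega=1.0, seed=0):
    """Sketch: correlated Nakagami-m amplitude and Gaussian phase series.

    s4        : scintillation index (sets Nakagami m = 1/s4**2)
    sigma_phi : phase standard deviation (radians)
    rho       : AR(1) coefficient giving the temporal correlation (illustrative)
    omega     : mean-square amplitude E[amp**2]
    """
    rng = np.random.default_rng(seed)
    m = 1.0 / s4**2
    k = max(1, int(round(2 * m)))     # number of squared Gaussians (m rounded to k/2)
    sig = np.sqrt(omega / k)          # per-process stationary std -> E[amp**2] = omega

    # k independent Gaussian AR(1) processes provide the amplitude correlation
    x = np.empty((k, n))
    x[:, 0] = rng.normal(0.0, sig, size=k)
    innov = rng.normal(0.0, sig * np.sqrt(1 - rho**2), size=(k, n))
    for t in range(1, n):
        x[:, t] = rho * x[:, t - 1] + innov[:, t]
    amp = np.sqrt((x**2).sum(axis=0))  # Nakagami-m distributed amplitude

    # correlated Gaussian phase with stationary std sigma_phi
    phase = np.empty(n)
    phase[0] = rng.normal(0.0, sigma_phi)
    pin = rng.normal(0.0, sigma_phi * np.sqrt(1 - rho**2), size=n)
    for t in range(1, n):
        phase[t] = rho * phase[t - 1] + pin[t]
    return amp, phase
```

In a faithful implementation, the AR processes would be designed so that their power spectra match the chosen amplitude and phase spectral models rather than the flat AR(1) shape used here.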
Abstract:
The mode III interlaminar fracture of carbon/epoxy laminates was evaluated with the edge crack torsion (ECT) test. Three-dimensional finite element analyses were performed in order to select two specimen geometries and an experimental data reduction scheme. Test results showed considerable non-linearity before the maximum load point and a significant R-curve effect. These features prevented an accurate definition of the initiation point. Nevertheless, analyses of the non-linearity zones showed two likely initiation points, corresponding to GIIIc values between 850 and 1100 J/m² for both specimen geometries. Although either of these values is realistic, the range is too broad, thus showing the limitations of the ECT test and the need for further research.
Abstract:
To boost logic density and reduce per-unit power consumption, SRAM-based FPGA manufacturers adopted nanometric technologies. However, this technology is highly vulnerable to radiation-induced faults, which affect values stored in memory cells, and to manufacturing imperfections. Fault-tolerant implementations, based on Triple Modular Redundancy (TMR) infrastructures, help to keep the circuit operating correctly. However, TMR alone is not sufficient to guarantee the safe operation of a circuit. Other issues, such as module placement, the effects of multi-bit upsets (MBU), and fault accumulation, must also be addressed. When a fault occurs, the correct operation of the affected module must be restored and/or the current state of the circuit coherently re-established. This paper presents a solution that enables the autonomous restoration of the functional definition of the affected module, avoiding fault accumulation and re-establishing the correct circuit state in real time, while keeping the circuit in normal operation.
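The fault-masking idea behind TMR can be illustrated with a bitwise majority voter: each output bit takes the value agreed on by at least two of the three redundant module outputs, so a single faulty replica is masked. This is only a software sketch of the logic (a real FPGA design implements the voter in hardware, and the function name is illustrative):

```python
def tmr_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote over three redundant module outputs.

    Each output bit is 1 iff at least two of the three inputs have that
    bit set, so a single faulty module cannot corrupt the voted result.
    """
    return (a & b) | (a & c) | (b & c)

# One corrupted replica is masked by the other two:
golden = 0b1011
assert tmr_vote(golden, golden, 0b0001) == golden
```

Note that the voter only masks the fault; as the abstract points out, the faulty module must still be repaired (e.g. by reconfiguration) before a second fault accumulates and defeats the majority.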
Abstract:
The cornerstone of the interoperability of eLearning systems is the standard definition of learning objects. Nevertheless, for some domains this standard is insufficient to fully describe all the assets, especially when they are used as input for other eLearning services. On the other hand, a standard definition of learning objects is not enough to ensure interoperability among eLearning systems; they must also use a standard API to exchange learning objects. This paper presents the design and implementation of a service-oriented repository of learning objects called crimsonHex. This repository is fully compliant with the existing interoperability standards and supports new definitions of learning objects for specialized domains. We illustrate this feature with the definition of programming problems as learning objects and their validation by the repository. The repository is also prepared to store usage data on learning objects, in order to tailor the presentation order and adapt it to learner profiles.
Abstract:
Adhesive bonding is nowadays a serious candidate to replace methods such as fastening or riveting because of its attractive mechanical properties. As a result, adhesives are being used increasingly in industries such as automotive, aerospace, and construction. It is therefore highly important to predict the strength of bonded joints, whether to assess the feasibility of joining during the fabrication of components (e.g. with complex geometries) or for repair purposes. This work studies the tensile behaviour of adhesive joints between aluminium adherends, considering different values of adherend thickness (h), using the double-cantilever beam (DCB) test. The experimental work consists of determining the tensile fracture toughness (GIC) for the different joint configurations. A conventional fracture characterization method was used, together with a J-integral approach that takes into account the plasticity effects occurring in the adhesive layer. An optical measurement method is used to evaluate the crack tip opening and adherend rotation at the crack tip during the test, supported by a Matlab® sub-routine for the automated extraction of these quantities. As the output of this work, a comparative evaluation of bonded systems with different adherend thicknesses is carried out, and complete fracture data in tension are provided for the subsequent strength prediction of joints under identical conditions.
Abstract:
The increasing and intensive integration of distributed energy resources into distribution systems requires adequate methodologies to ensure secure operation according to the smart grid paradigm. In this context, SCADA (Supervisory Control and Data Acquisition) systems are an essential infrastructure. This paper presents a conceptual design of a communication and resource management scheme based on an intelligent SCADA with a decentralized, flexible, and intelligent approach that is adaptive to the context (context awareness). The methodology is used to support energy resource management, considering all the involved costs, power flows, and electricity prices leading to network reconfiguration. The methodology also addresses the definition of each player's information access permissions for each resource. The paper includes a case study on a 33-bus network that considers intensive use of distributed energy resources in five distinct implemented operation contexts.
Abstract:
Demand response programs and models have been developed and implemented for an improved performance of electricity markets, taking full advantage of smart grids. Studying and addressing consumers' flexibility and network operation scenarios makes it possible to design improved demand response models and programs. The methodology proposed in the present paper addresses the definition of demand response programs that consider demand shifting between periods, regarding the occurrence of multi-period demand response events. The optimization model focuses on minimizing the network and resource operation costs for a Virtual Power Player. Quantum Particle Swarm Optimization is used to obtain solutions for the optimization model, which is applied to a large set of operation scenarios. The implemented case study illustrates the use of the proposed methodology to support the Virtual Power Player's decisions concerning the duration of each demand response event.
Abstract:
This work project addresses the importance of succession planning in family-owned Small and Medium Enterprises (SMEs). Succession planning is directly related to Human Resources Management (HRM), given that a long-term HRM vision is required for succession to be planned in time and benefit the companies. This study focused on SMEs since these are the entities that place the least focus on HRM practices. A total of 22 in-depth interviews were conducted and analyzed. Selected SME owners/managers and successors/predecessors were interviewed with the purpose of gaining more insight into the level of succession planning, using a qualitative methodology from which the succession process was derived. The study reveals that the first step in this process is the definition of criteria for a good successor, followed by the choice of possible successors, with the children as the natural successors but other potential ones also considered, and finally some considerations on the future of these companies.
Abstract:
Field lab: Entrepreneurial and innovative ventures
Abstract:
The 2009 International Society of Urological Pathology consensus conference in Boston made recommendations regarding the standardization of pathology reporting of radical prostatectomy specimens. Issues relating to the substaging of pT2 prostate cancers according to the TNM 2002/2010 system, and the reporting of tumor size/volume and zonal location of prostate cancers, were coordinated by working group 2. A survey circulated before the consensus conference demonstrated that 74% of the 157 participants considered pT2 substaging of prostate cancer to be of clinical and/or academic relevance. The survey also revealed considerable variation in the frequency of reporting of pT2b substage prostate cancer, which was likely a consequence of the variable methodologies used to distinguish pT2a from pT2b tumors. An overview of the literature indicates that current pT2 substaging criteria lack clinical relevance, and the majority (65.5%) of conference attendees wished to discontinue pT2 substaging. Therefore, the consensus was that reporting of pT2 substages should, at present, be optional. Several studies have shown that prostate cancer volume is significantly correlated with other clinicopathological features, including Gleason score and extraprostatic extension of tumor; however, most studies fail to demonstrate this to have prognostic significance on multivariate analysis. Consensus was reached with regard to the reporting of some quantitative measure of the volume of tumor in a prostatectomy specimen, without prescribing a specific methodology. Incorporation of the zonal and/or anterior location of the dominant/index tumor in the pathology report was accepted by most participants, but a formal definition of the identifying features of the dominant/index tumor remained undecided.
Abstract:
Fifty-three patients with histologically proven carcinoma were injected with highly purified [131I]-labeled goat antibodies or fragments of antibodies against carcinoembryonic antigen (CEA). Each patient was tested by external photoscanning 4, 24, 36 and 48 h after injection. In 22 patients (16 of 38 injected with intact antibodies, 5 of 13 with F(ab')2 fragments and 1 of 2 with Fab' fragments), an increased concentration of 131I radioactivity corresponding to the previously known tumor location was detected by photoscanning 36-48 h after injection. Blood pool and secreted radioactivity was determined in all patients by injecting, 15 min before scanning, [99mTc]-labeled normal serum albumin and free 99mTcO4-. The computerized subtraction of 99mTc from 131I radioactivity enhanced the definition of tumor localization in the 22 positive patients. However, in spite of the computerized subtraction, interpretation of the scans remained doubtful for 12 patients and was entirely negative for 19 additional patients. In order to provide a more objective evaluation of the specificity of the tumor localization of antibodies, 14 patients scheduled for tumor resection were injected simultaneously with [131I]-labeled antibodies or fragments and with [125I]-labeled normal goat IgG or fragments. After surgery, the radioactivity of the two isotopes present either in tumor or in adjacent normal tissues was measured in a dual-channel scintillation counter. The results showed that the antibodies or their fragments were 2-4 times more concentrated in the tumor than in the normal tissues. In addition, it was shown that the injected antibodies formed immune complexes with circulating CEA and that the amount of immune complexes detectable in serum was roughly proportional to the level of circulating CEA.
Abstract:
We have used massively parallel signature sequencing (MPSS) to sample the transcriptomes of 32 normal human tissues to an unprecedented depth, thus documenting the patterns of expression of almost 20,000 genes with high sensitivity and specificity. The data confirm the widely held belief that differences in gene expression between cell and tissue types are largely determined by transcripts derived from a limited number of tissue-specific genes, rather than by combinations of more promiscuously expressed genes. Expression of a little more than half of all known human genes seems to account for both the common requirements and the specific functions of the tissues sampled. A classification of tissues based on patterns of gene expression largely reproduces classifications based on anatomical and biochemical properties. The unbiased sampling of the human transcriptome achieved by MPSS supports the idea that most human genes have been mapped, if not functionally characterized. This data set should prove useful for the identification of tissue-specific genes, for the study of global changes induced by pathological conditions, and for the definition of a minimal set of genes necessary for basic cell maintenance. The data are available on the Web at http://mpss.licr.org and http://sgb.lynxgen.com.
Abstract:
The World Health Organization (WHO) criteria for the diagnosis of osteoporosis are mainly applicable to dual X-ray absorptiometry (DXA) measurements at the spine and hip. There is a growing demand for cheaper devices free of ionizing radiation, such as the promising quantitative ultrasound (QUS). In common with many other countries, QUS measurements are increasingly used in Switzerland without adequate clinical guidelines. The T-score approach developed for DXA cannot be applied to QUS, although well-conducted prospective studies have shown that ultrasound can be a valuable predictor of fracture risk. As a consequence, an expert committee named the Swiss Quality Assurance Project (SQAP, whose main mission is the establishment of quality assurance procedures for DXA and QUS in Switzerland) was mandated by the Swiss Association Against Osteoporosis (ASCO) in 2000 to propose operational clinical recommendations for the use of QUS in the management of osteoporosis for two QUS devices sold in Switzerland. Device-specific weighted "T-scores", based on the risk of osteoporotic hip fracture as well as on the prediction of DXA osteoporosis at the hip according to the WHO definition of osteoporosis, were calculated for the Achilles (Lunar, General Electric, Madison, Wis.) and Sahara (Hologic, Waltham, Mass.) ultrasound devices. Several studies (totaling a few thousand subjects) were used to calculate age-adjusted odds ratios (OR) and the area under the receiver operating characteristic curve (AUC) for the prediction of osteoporotic fracture (taking into account a weighting score depending on the design of each study involved in the calculation). The OR was 2.4 (1.9-3.2) and the AUC 0.72 (0.66-0.77) for the Achilles device, and 2.3 (1.7-3.1) and 0.75 (0.68-0.82), respectively, for the Sahara device.
To translate risk estimates into thresholds for clinical application, 90% sensitivity was used to define low fracture and low osteoporosis risk, and 80% specificity was used to define subjects as being at high risk of fracture or of having osteoporosis at the hip. From the combination of the fracture model with the hip DXA osteoporosis model, we found T-score thresholds of -1.2 and -2.5 for the stiffness index (Achilles), determining the low- and high-risk subjects, respectively. Similarly, we found T-scores of -1.0 and -2.2 for the QUI index (Sahara). A screening strategy combining QUS, DXA, and clinical factors for the identification of women needing treatment was then proposed. The application of this approach will help to minimize the inappropriate use of QUS from which the whole field currently suffers.
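The two-threshold triage described above can be expressed as a small decision rule. The sketch below uses only the cut-offs reported in the abstract (-1.2/-2.5 for the Achilles stiffness index, -1.0/-2.2 for the Sahara QUI index); the function name, device keys, and category labels are illustrative, and real clinical use would combine this with DXA and clinical risk factors as the proposed strategy requires.

```python
def qus_risk_category(t_score: float, device: str = "achilles") -> str:
    """Map a device-specific QUS T-score to a screening category.

    Above the 90%-sensitivity cut-off -> "low" (low fracture/osteoporosis risk);
    below the 80%-specificity cut-off -> "high";
    in between -> "intermediate" (candidate for DXA referral).
    """
    thresholds = {
        "achilles": (-1.2, -2.5),  # stiffness index cut-offs (Lunar Achilles)
        "sahara": (-1.0, -2.2),    # QUI index cut-offs (Hologic Sahara)
    }
    low_cut, high_cut = thresholds[device]
    if t_score >= low_cut:
        return "low"
    if t_score <= high_cut:
        return "high"
    return "intermediate"
```

For example, a Sahara QUI T-score of -1.5 falls between the cut-offs and would be classified as intermediate, i.e. a case where the screening strategy would refer the patient to DXA.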
Abstract:
BACKGROUND: Invasive fungal diseases are important causes of morbidity and mortality. Clarity and uniformity in defining these infections are important factors in improving the quality of clinical studies. A standard set of definitions strengthens the consistency and reproducibility of such studies. METHODS: After the introduction of the original European Organization for Research and Treatment of Cancer/Invasive Fungal Infections Cooperative Group and the National Institute of Allergy and Infectious Diseases Mycoses Study Group (EORTC/MSG) Consensus Group definitions, advances in diagnostic technology and the recognition of areas in need of improvement led to a revision of this document. The revision process started with a meeting of participants in 2003, to decide on the process and to draft the proposal. This was followed by several rounds of consultation until a final draft was approved in 2005. This was made available for 6 months to allow public comment, and then the manuscript was prepared and approved. RESULTS: The revised definitions retain the original classifications of "proven," "probable," and "possible" invasive fungal disease, but the definition of "probable" has been expanded, whereas the scope of the category "possible" has been diminished. The category of proven invasive fungal disease can apply to any patient, regardless of whether the patient is immunocompromised, whereas the probable and possible categories are proposed for immunocompromised patients only. CONCLUSIONS: These revised definitions of invasive fungal disease are intended to advance clinical and epidemiological research and may serve as a useful model for defining other infections in high-risk patients.