20 results for standard on auditing
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
In order to correctly assess biaxial fatigue material properties, one must experimentally test different load conditions and stress levels. With the rise of new in-plane biaxial fatigue testing machines, which use smaller and more efficient electrical motors instead of the conventional hydraulic machines, it is necessary to reduce the specimen size and to ensure that the specimen geometry is appropriate for the installed load capacity. At present there are no standard specimen geometries, and the guidance in the literature on how to design an efficient test specimen is insufficient. The main goal of this paper is to present a methodology for obtaining an optimal cruciform specimen geometry, with thickness reduction in the gauge area, appropriate for fatigue crack initiation, as a function of the base material sheet thickness used to build the specimen. The geometry is optimized for maximum stress over several parameters, ensuring that in the gauge area the stress distributions along the loading directions are uniform and maximal under two limit phase-shift loading conditions (δ = 0° and δ = 180°). Therefore, fatigue damage will always initiate at the center of the specimen, avoiding failure outside this region. Using the Renard series of preferred numbers for the base material sheet thickness as a reference, the remaining geometry parameters are optimized using a derivative-free methodology, the direct multisearch (DMS) method. The final optimal geometry as a function of the base material sheet thickness is proposed as a guideline for cruciform specimen design and as a possible contribution to a future standard on in-plane biaxial fatigue tests.
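As a rough illustration of the kind of optimization loop involved (a stand-in only: the paper couples the direct multisearch method to finite-element stress evaluations, neither of which is reproduced here), the sketch below tunes a few hypothetical geometry parameters for one fixed Renard-series sheet thickness using an off-the-shelf derivative-free optimizer.

```python
# Illustrative stand-in only: the paper uses the direct multisearch (DMS) method with
# finite-element analyses; here a generic derivative-free optimizer and a dummy stress
# evaluator are used, and the geometry parameters are hypothetical.
import numpy as np
from scipy.optimize import minimize

SHEET_THICKNESS = 2.0  # mm, e.g. a Renard-series preferred value (assumption)

def stress_objective(params: np.ndarray) -> float:
    """Placeholder for an FE evaluation: returns a penalty that is small when the
    gauge-area stress is high and uniform for both phase-shift load cases."""
    fillet_radius, gauge_radius, gauge_thickness = params
    # Dummy smooth function standing in for the FE result (illustrative only).
    return (gauge_thickness / SHEET_THICKNESS - 0.3) ** 2 \
        + (gauge_radius / fillet_radius - 0.5) ** 2

x0 = np.array([20.0, 10.0, 1.0])  # initial guesses for the hypothetical parameters (mm)
result = minimize(stress_objective, x0, method="Nelder-Mead")
print(result.x)  # candidate geometry for this sheet thickness
```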
Abstract:
The growing heterogeneity of networks, devices and consumption conditions calls for flexible and adaptive video coding solutions. The compression power of the HEVC standard and the benefits of the distributed video coding paradigm allow the design of novel scalable coding solutions with improved error robustness and low encoding complexity, while still achieving competitive compression efficiency. In this context, this paper proposes a novel scalable video coding scheme using an HEVC Intra compliant base layer and a distributed coding approach in the enhancement layers (EL). This design inherits the HEVC compression efficiency while providing low encoding complexity at the enhancement layers. The temporal correlation is exploited at the decoder to create the EL side information (SI) residue, an estimate of the original residue. The EL encoder sends only the data that cannot be inferred at the decoder, thus exploiting the correlation between the original and SI residues; however, this correlation must be characterized with an accurate correlation model to obtain coding efficiency improvements. Therefore, this paper proposes a correlation modeling solution to be used at both encoder and decoder, without requiring a feedback channel. Experimental results confirm that the proposed scalable coding scheme has lower encoding complexity and provides BD-Rate savings of up to 3.43% in comparison with the HEVC Intra scalable extension under development. © 2014 IEEE.
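A common choice for such correlation models in distributed video coding is a zero-mean Laplacian distribution of the correlation noise; as a hedged illustration (not necessarily the specific model proposed by the authors), the sketch below fits the Laplacian parameter from the empirical variance of a residue difference.

```python
# Illustrative only: a Laplacian correlation-noise model, commonly used in distributed
# video coding, is NOT necessarily the model proposed in the paper. The sketch fits the
# Laplacian parameter alpha from the empirical variance of correlation-noise samples.
import numpy as np

def laplacian_alpha(correlation_noise: np.ndarray) -> float:
    """For a zero-mean Laplacian, variance = 2 / alpha**2, hence alpha = sqrt(2 / var)."""
    var = float(np.var(correlation_noise))
    return float(np.sqrt(2.0 / var)) if var > 0 else float("inf")

# Toy data standing in for (original residue - side-information residue) coefficients.
rng = np.random.default_rng(1)
noise = rng.laplace(loc=0.0, scale=3.0, size=10_000)
print(laplacian_alpha(noise))  # should be close to 1/3 for scale = 3
```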
Abstract:
We write down the renormalization-group equations for the Yukawa-coupling matrices in a general multi-Higgs-doublet model. We then assume that the matrices of the Yukawa couplings of the various Higgs doublets to right-handed fermions of fixed quantum numbers are all proportional to each other. We demonstrate that, in the case of the two-Higgs-doublet model, this proportionality is preserved by the renormalization-group running only in the cases of the standard type-I, II, X, and Y models. We furthermore show that a similar result holds even when there are more than two Higgs doublets: the Yukawa-coupling matrices to fermions of a given electric charge remain proportional under the renormalization-group running if and only if there is a basis for the Higgs doublets in which all the fermions of a given electric charge couple to only one Higgs doublet.
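In a notation commonly used in the multi-Higgs-doublet literature (a sketch of the stated assumption, not an equation quoted from the paper), the proportionality hypothesis for a model with $n_H$ doublets reads

$$\Gamma_k = c_k^{d}\,\Gamma_1, \qquad \Delta_k = c_k^{u}\,\Delta_1, \qquad \Pi_k = c_k^{e}\,\Pi_1, \qquad k = 1,\dots,n_H,$$

where $\Gamma_k$, $\Delta_k$ and $\Pi_k$ denote the Yukawa-coupling matrices of the doublet $\Phi_k$ to the right-handed down-type quarks, up-type quarks and charged leptons, respectively, and the $c_k$ are constants. The result quoted above states that this proportionality is stable under the renormalization-group running only when a Higgs basis exists in which all fermions of a given electric charge couple to a single doublet.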
Abstract:
A newly developed solid-state repetitive high-voltage (HV) pulse modulator topology, derived from the mature concept of the d.c. voltage multiplier (VM), is described. The proposed circuit is based on a voltage-multiplier-type circuit in which a number of d.c. capacitors share a common connection, each with a different voltage rating. Hence, besides the standard VM rectifier and coupling diodes, two solid-state on/off switches are used in each stage to switch from the typical charging VM mode to a pulse mode in which the d.c. capacitors are connected in series with the load. Because the on/off semiconductors are arranged in half-bridge structures, the maximum voltage blocked by each switch is the d.c. capacitor voltage of its stage. A 2 kV prototype is described and the results are compared with PSPICE simulations.
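Neglecting diode and switch voltage drops (an idealization for orientation, not a figure from the paper), series-connecting the stage capacitors during the pulse mode yields

$$v_{\text{pulse}} \approx n\,V_C, \qquad v_{\text{block,switch}} \approx V_C,$$

where $n$ is the number of stages and $V_C$ the d.c. capacitor voltage of one stage; the second relation restates the half-bridge blocking condition mentioned above.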
Abstract:
It is shown that type I seesaw models based on the standard model Lagrangian extended with three heavy Majorana right-handed fields do not have leptogenesis at leading order if the symmetries of the mass matrices are also residual symmetries of the Lagrangian. In particular, flavor models that lead to mass-independent leptonic mixing have a vanishing leptogenesis CP asymmetry. Based on symmetry arguments, we prove that in these models the Dirac-neutrino Yukawa coupling combinations relevant for leptogenesis are diagonal in the physical basis where the charged leptons and heavy Majorana neutrinos are diagonal.
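For orientation, the standard type I seesaw relations behind this statement (written in a common convention, not quoted from the paper) are

$$m_\nu \simeq -\frac{v^2}{2}\, Y_\nu\, M_R^{-1}\, Y_\nu^{T}, \qquad \epsilon_i \propto \sum_{j\neq i} \operatorname{Im}\!\left[\big(Y_\nu^\dagger Y_\nu\big)_{ij}^{2}\right] f\!\big(M_j^2/M_i^2\big),$$

so that if $Y_\nu^\dagger Y_\nu$ is diagonal in the basis where the heavy Majorana masses are diagonal, as in the flavor models discussed above, the leading-order CP asymmetries $\epsilon_i$ vanish.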
Abstract:
We study the implications of the searches based on H → τ⁺τ⁻ by the ATLAS and CMS collaborations for the parameter space of the two-Higgs-doublet model (2HDM). In the 2HDM, the scalars can decay into a tau pair with a branching ratio larger than the SM one, leading to constraints on the 2HDM parameter space. We show that in model II, values of tan β > 1.8 are definitively excluded if the pseudoscalar is in the mass range 110 GeV < m_A < 145 GeV. We also discuss the implications for the 2HDM of the recent dimuon search by the ATLAS collaboration for a CP-odd scalar in the mass range 4-12 GeV.
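The sensitivity to tan β quoted above follows from the model II (type-II) Yukawa pattern, in which the pseudoscalar couples to down-type quarks and charged leptons with an enhancement proportional to tan β (standard 2HDM relations, stated here for context rather than quoted from the paper):

$$g_{Abb} = \frac{m_b}{v}\,\tan\beta, \qquad g_{A\tau\tau} = \frac{m_\tau}{v}\,\tan\beta,$$

so the τ⁺τ⁻ signal rate grows with tan β, which is why the searches exclude the larger tan β values in that mass range.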
Abstract:
This paper reports experimental work on the use of new heterogeneous solid basic catalysts for biodiesel production: double oxides of Mg and Al produced by high-temperature calcination of Mg-Al lamellar structures, the hydrotalcites (HT). The most suitable catalyst systems studied were the hydrotalcites Mg:Al 2:1 calcined at 507 °C and 700 °C, which also led to higher FAME values in the second reaction stage. One of the prepared catalysts resulted in 97.1% fatty acid methyl esters (FAME) in the 1st reaction step, 92.2% FAME in the 2nd reaction step and 34% FAME in the 3rd reaction step. The biodiesel obtained in the transesterification reaction showed composition and quality parameters within the limits specified by the European Standard EN 14214. A catalyst loading of 2.5 wt% (catalyst/oil), a methanol:oil molar ratio of 9:1 or 12:1, a temperature of 60-65 °C and a reaction time of 4 h were the best operating conditions achieved in this study. This study showed the potential of Mg/Al hydrotalcites as heterogeneous catalysts for biodiesel production. (C) 2011 Elsevier Ltd. All rights reserved.
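For context, the stoichiometry of the transesterification reaction behind the quoted methanol:oil ratios is the standard one (general chemistry, not a result of this study):

$$\text{triglyceride} + 3\,\text{CH}_3\text{OH} \;\xrightarrow{\text{catalyst}}\; 3\,\text{FAME} + \text{glycerol},$$

so the 9:1 and 12:1 molar ratios correspond to a three- to four-fold methanol excess, commonly used to drive the equilibrium towards the esters.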
Abstract:
CoDeSys ("Controller Development System") is a development environment for programming automation controllers. It is an open source solution completely in line with the international industrial standard IEC 61131-3, and all five programming languages for application programming defined in IEC 61131-3 are available in the environment. These features give professionals greater flexibility with regard to programming and allow control engineers to program many different applications in the languages in which they feel most comfortable. Over 200 manufacturers of devices from different industrial sectors offer intelligent automation devices with a CoDeSys programming interface. In 2006, version 3 was released with new updates and tools. One of the great innovations of the new version of CoDeSys is object-oriented programming (OOP), which offers great advantages to the user, for example when reusing existing parts of an application or when several developers work on one application. For such reuse, source code with several well-known parts can be prepared and then generated automatically where necessary in a project, improving time/cost/quality management. Until now, in version 2, a hardware interface called "Eni-Server" was necessary to access the generated XML code. Another novelty of the new version is a tool called Export PLCopenXML, which makes it possible to export the open XML code without the need for specific hardware. This type of code has its own requirements in order to comply with the standard described above. With the XML code and knowledge of how it works, it is possible to carry out component-oriented development of machines with modular programming in an easy way. Eplan Engineering Center (EEC) is a software tool developed by Mind8 GmbH & Co. KG that allows automation projects to be configured and generated; to do so, it uses modules of PLC code. The EEC already has a library to generate code for CoDeSys version 2. For version 3, and given the constant innovation of drivers by manufacturers, a new library needs to be implemented in this software; it is therefore important to study the XML export in order to be able to design any type of machine. The purpose of this master thesis is to study the new CoDeSys XML version, taking into account all aspects and the impact on the existing CoDeSys V2 models and libraries at the company Harro Höfliger Verpackungsmaschinen GmbH. To achieve this goal, a small sample named "Traffic light" will first be implemented in CoDeSys version 2; then, using the tools of the new version, a version 3 project will be created, together with the EEC implementation for the automatically generated code.
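As a rough illustration of how such an export can be consumed programmatically (not part of the thesis itself), the following sketch lists the POUs found in a PLCopen XML file; the file name is hypothetical and the namespace/element names follow the PLCopen TC6 schema as commonly published, so treat them as assumptions.

```python
# Minimal sketch: list the POUs (programs/function blocks) declared in a PLCopenXML export.
# Assumptions: the file name and the TC6 namespace/element names are illustrative only.
import xml.etree.ElementTree as ET

NS = {"tc6": "http://www.plcopen.org/xml/tc6_0200"}  # assumed PLCopen TC6 namespace

def list_pous(path: str):
    tree = ET.parse(path)
    root = tree.getroot()
    # POUs normally live under types/pous in the TC6 schema (assumption).
    for pou in root.findall(".//tc6:types/tc6:pous/tc6:pou", NS):
        yield pou.get("name"), pou.get("pouType")

if __name__ == "__main__":
    for name, pou_type in list_pous("traffic_light_export.xml"):  # hypothetical file
        print(f"{pou_type}: {name}")
```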
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples or samples subjected to different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed if current statistical methods are used. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The proposed methodology was implemented in the open-source R software. Results: The method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC). On both datasets, the differentially expressed genes with bimodal or multimodal distributions were missed by the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not-proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions and allows different variances in the two samples. Another advantage of our method is that the behavior of different kinds of differentially expressed genes can be analyzed graphically. Conclusion: Our results indicate that the arrow plot represents a new flexible and useful tool for the analysis of gene expression profiles from microarrays.
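To make the two distance measures concrete, the sketch below (a minimal Python illustration, not the authors' R implementation) estimates, for one gene, the overlapping coefficient between the two group densities and the AUC between the two groups.

```python
# Minimal sketch (not the authors' implementation): for one gene, estimate the two
# distance measures used by the arrow plot, i.e. the overlapping coefficient (OVL)
# between the two group densities and the area under the ROC curve (AUC).
import numpy as np
from scipy.stats import gaussian_kde, mannwhitneyu

def ovl_and_auc(expr_a, expr_b, grid_size=512):
    expr_a, expr_b = np.asarray(expr_a, float), np.asarray(expr_b, float)
    # OVL: integral of min(f_a, f_b) over a common grid, using kernel density estimates.
    lo, hi = min(expr_a.min(), expr_b.min()), max(expr_a.max(), expr_b.max())
    grid = np.linspace(lo, hi, grid_size)
    f_a, f_b = gaussian_kde(expr_a)(grid), gaussian_kde(expr_b)(grid)
    ovl = float(np.minimum(f_a, f_b).sum() * (grid[1] - grid[0]))
    # AUC: probability that a random sample from group B exceeds one from group A,
    # obtained from the Mann-Whitney U statistic.
    u, _ = mannwhitneyu(expr_b, expr_a, alternative="two-sided")
    auc = float(u) / (len(expr_a) * len(expr_b))
    return ovl, auc

# Toy usage with simulated expression values for one gene in two groups.
rng = np.random.default_rng(0)
print(ovl_and_auc(rng.normal(0, 1, 30), rng.normal(2, 1, 30)))
```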
Abstract:
In the two-Higgs-doublet model, there is the possibility that the vacuum in which the universe resides is metastable. We present the tree-level bounds on the scalar potential parameters which have to be obeyed to prevent that situation. Analytical expressions for those bounds are given for the most commonly used potential, the one with a softly broken Z2 symmetry. The impact of those bounds on the model's phenomenology is discussed in detail, as well as the importance of the current LHC results in determining whether the vacuum we live in is stable or not. We demonstrate how the vacuum stability bounds can be obtained for the most generic CP-conserving potential, and provide a simple method to implement them.
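For reference, the softly broken Z2-symmetric potential referred to above is conventionally written as (standard 2HDM notation, not quoted from the paper):

$$V = m_{11}^2\,\Phi_1^\dagger\Phi_1 + m_{22}^2\,\Phi_2^\dagger\Phi_2 - \left(m_{12}^2\,\Phi_1^\dagger\Phi_2 + \text{h.c.}\right) + \tfrac{\lambda_1}{2}\big(\Phi_1^\dagger\Phi_1\big)^2 + \tfrac{\lambda_2}{2}\big(\Phi_2^\dagger\Phi_2\big)^2 + \lambda_3\big(\Phi_1^\dagger\Phi_1\big)\big(\Phi_2^\dagger\Phi_2\big) + \lambda_4\big(\Phi_1^\dagger\Phi_2\big)\big(\Phi_2^\dagger\Phi_1\big) + \left[\tfrac{\lambda_5}{2}\big(\Phi_1^\dagger\Phi_2\big)^2 + \text{h.c.}\right],$$

where the $m_{12}^2$ term is the soft breaking of the Z2 symmetry and the tree-level vacuum stability bounds constrain combinations of the $m_{ij}^2$ and $\lambda_i$.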
The use of non-standard CT conversion ramps for Monte Carlo verification of 6 MV prostate IMRT plans
Abstract:
Monte Carlo (MC) dose calculation algorithms have been widely used to verify the accuracy of intensity-modulated radiotherapy (IMRT) dose distributions computed by conventional algorithms, due to their ability to precisely account for the effects of tissue inhomogeneities and multileaf collimator characteristics. The two classes of algorithms differ, however, in how dose is calculated and reported. Whereas dose from conventional methods is traditionally computed and reported as the water-equivalent dose (Dw), MC dose algorithms calculate and report dose to medium (Dm). In order to compare both methods consistently, the conversion of MC Dm into Dw is therefore necessary. This study aims to assess the effect of applying the conversion of MC-based Dm distributions to Dw for prostate IMRT plans generated for 6 MV photon beams. MC phantoms were created from the patient CT images using three different ramps to convert CT numbers into material and mass density: a conventional four-material ramp (CTCREATE) and two simplified CT conversion ramps: (1) air and water with variable densities and (2) air and water with unit density. MC simulations were performed using the BEAMnrc code for the treatment head simulation and the DOSXYZnrc code for the patient dose calculation. The conversion of Dm to Dw by scaling with the stopping power ratios of water to medium was also performed in a post-MC calculation process. The comparison of MC dose distributions calculated in the conventional and simplified (water with variable densities) phantoms showed that the effect of material composition on dose-volume histograms (DVH) was less than 1% for soft tissue and about 2.5% near and inside bone structures. Comparing the MC distributions computed in the two simplified water phantoms, the effect of material density on the DVH was less than 1% for all tissues. Additionally, MC dose distributions were compared with the predictions of an Eclipse treatment planning system (TPS), which employed a pencil beam convolution (PBC) algorithm with the Modified Batho Power Law heterogeneity correction. Eclipse PBC and MC calculations (conventional and simplified phantoms) agreed well (<1%) for soft tissues. For the femoral heads, differences of up to 3% were observed between the DVHs for Eclipse PBC and MC calculated in conventional phantoms. The use of the CT conversion ramp of water with variable densities for MC simulations showed dose discrepancies with the PBC algorithm within 0.5%. Moreover, converting Dm to Dw using mass stopping power ratios resulted in a significant shift (up to 6%) in the DVH for the femoral heads compared to the Eclipse PBC one. Our results show that, for prostate IMRT plans delivered with 6 MV photon beams, no conversion of MC dose from medium to water using stopping power ratios is needed; instead, MC dose calculation using water with variable density may be a simple way to avoid the problem found with the dose conversion method based on the stopping power ratio.
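The post-MC conversion referred to above is usually carried out with the Bragg-Gray relation (the standard form found in the literature; the exact averaging used in this study is not restated here):

$$D_w = D_m\,\bar{s}_{w,m}, \qquad \bar{s}_{w,m} = \frac{\displaystyle\int \Phi_E\,(S/\rho)_{w}\,\mathrm{d}E}{\displaystyle\int \Phi_E\,(S/\rho)_{m}\,\mathrm{d}E},$$

where $\bar{s}_{w,m}$ is the water-to-medium mass collision stopping power ratio averaged over the local electron fluence spectrum $\Phi_E$; for bone this ratio deviates appreciably from unity, which is why the femoral-head DVHs shift when the conversion is applied.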
Abstract:
A new high-performance architecture for the computation of all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is based on a completely configurable, scalable and unified structure, able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16 and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrate the superior performance and hardware efficiency of the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, this architecture allows the real-time computation of all the transforms mentioned above for resolutions as high as 8K Ultra High Definition Television (UHDTV) (7680×4320 @ 30 fps).
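For readers unfamiliar with these transforms, the sketch below is a software reference of the smallest one (the H.264/AVC 4×4 forward integer core transform); it illustrates the arithmetic only and is unrelated to the hardware architecture described above.

```python
# Minimal software reference (not the hardware architecture): the 4x4 forward integer
# "core" transform of H.264/AVC, Y = C X C^T (post-scaling/quantisation omitted).
import numpy as np

C4 = np.array([[1,  1,  1,  1],
               [2,  1, -1, -2],
               [1, -1, -1,  1],
               [1, -2,  2, -1]])

def forward_4x4(block: np.ndarray) -> np.ndarray:
    """Apply the H.264/AVC 4x4 integer core transform to one residual block."""
    return C4 @ block @ C4.T

residual = np.arange(16).reshape(4, 4)  # toy residual block
print(forward_4x4(residual))
```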
Abstract:
Objectives - To identify radiographers' postures during frequent mammography procedures in relation to the mammography equipment and patient characteristics. Methods - A postural task analysis was performed using images acquired during the simulation of mammography positioning procedures. Simulations included craniocaudal (CC) and mediolateral-oblique (MLO) positioning in three different settings: radiographers and patients with similar statures, radiographers shorter than the patients, and radiographers taller than the patients. Measurements of postural angles were performed by two raters using appropriate software and classified according to European Standard EN 1005-4:2005+A1:2008. Results - The simulations revealed that the most awkward posture in mammography occurs during the positioning of the MLO projection in short-stature patients. Postures identified as posing a work-related musculoskeletal disorder (WRMSD) risk were neck extension, elevated arms and a stooped back, with angles of 87.2°, 118.6° and 63.6°, respectively. When radiographers were taller than patients, the trunk and arm postures were not acceptable. Conclusions - Working in a mammography room leads to awkward postures that can have an impact on radiographers' health, namely WRMSDs. The results of this study show that there are non-acceptable postures associated with frequent working procedures in mammography. MLO is the most demanding procedure for radiographer postures and may be related to WRMSDs. Mammography devices should be redesigned considering adjustability for radiographers. Main messages: • Constraints for radiographers in mammography procedures have not been well studied. • Performing mammography leads to awkward postures that can impact radiographers' health. • MLO, the most demanding procedure for radiographers, is possibly related to WRMSDs.
Abstract:
Background: In Angola, malaria is an endemic disease with a major impact on the economy. The WHO recommends testing all suspected malaria cases in order to avoid the presumptive treatment of this disease. In malaria-endemic regions, laboratory technicians must be very proficient in microscopy, the gold standard for malaria diagnosis, to avoid incorrect diagnoses. The improper use of medication promotes drug resistance and undesirable side effects. The present study aims to assess the impact of a three-day refresher course on the knowledge of technicians, the quality of blood smear preparation and the accuracy of microscopy malaria diagnosis, using qPCR as the reference method. Methods: This study was implemented in laboratories of three hospitals in different provinces of Angola: Bengo, Benguela and Luanda. In each laboratory, samples were collected before and after the training course (a slide with thin and thick blood smears, a dried blood spot and a form). The impact of the intervention was evaluated through a written test, the quality of slide preparation and the performance of microscopy. Results: A significant increase in the written test median score was found, from 52.5% to 65.0%. A total of 973 slides were analysed to evaluate the quality of thick and thin blood smears. Considering all laboratories, there was a significant increase in the quality of thick and thin blood smears. To determine the performance of microscopy using qPCR as the reference method, 1,028 samples were used. Benguela presented the highest specificity values, 92.9% and 98.8% pre- and post-course, respectively; for sensitivity, the best pre-course value was obtained in Benguela (75.9%) and the best post-course value in Luanda (75.0%). However, no significant increase in sensitivity or specificity after the training course was registered in any of the laboratories analysed. Discussion: The findings of this study support the need for continuous refresher training for microscopists and other laboratory staff. The laboratories should have a quality control programme to supervise the diagnosis and also to assess the periodicity of new training. Other variables also need to be considered for a correct malaria diagnosis, such as adequate equipment and reagents for staining and visualization, good working conditions, and motivated and qualified personnel.
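For clarity on how the quoted diagnostic performance measures are obtained, the sketch below computes sensitivity and specificity of microscopy against the qPCR reference from a 2×2 confusion table; the counts shown are illustrative, not data from this study.

```python
# Minimal sketch of the diagnostic performance calculation, treating qPCR as the
# reference method. The counts in the example call are hypothetical, not study data.
def sensitivity_specificity(tp: int, fp: int, fn: int, tn: int) -> tuple[float, float]:
    """Microscopy vs. qPCR reference: TP/FN are qPCR-positive samples, TN/FP qPCR-negative."""
    sensitivity = tp / (tp + fn)   # fraction of qPCR-positive samples detected by microscopy
    specificity = tn / (tn + fp)   # fraction of qPCR-negative samples correctly reported negative
    return sensitivity, specificity

# Hypothetical counts only, to show the calculation.
print(sensitivity_specificity(tp=66, fp=12, fn=22, tn=158))
```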