19 results for FEC using Reed-Solomon and Tornado codes
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for interconnection with neighboring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class is assigned a different discarding priority in queues. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority) that is sent continuously from ingress to egress through a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then subjected to admission control. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priority classes; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
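As a rough illustration of the AC logic described above, the following pure-Python sketch admits a flow only when the probe-measured available throughput covers its guaranteed minimum. The class name, rates and the feedback interface are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of probe-based admission control: the egress measures
# how much of the probing flow (highest discard priority) survives the path
# and reports it back; the ingress admits a new flow only if that measured
# headroom covers the flow's guaranteed minimum. All numbers illustrative.

class Ingress:
    def __init__(self, probe_rate_mbps):
        self.probe_rate = probe_rate_mbps   # rate of the continuous probe flow
        self.available = 0.0                # last feedback received from egress
        self.admitted = []                  # flows currently receiving the GMTS

    def egress_feedback(self, surviving_probe_mbps):
        # Probe packets are discarded first, so the surviving probe traffic
        # approximates the throughput still available on the path.
        self.available = surviving_probe_mbps

    def new_flow(self, flow_id, min_throughput_mbps):
        if min_throughput_mbps <= self.available:
            self.admitted.append(flow_id)   # packets marked lowest discard priority
            self.available -= min_throughput_mbps
            return "GMTS"
        return "best-effort"                # rejected flows are served, not dropped

ingress = Ingress(probe_rate_mbps=10.0)
ingress.egress_feedback(surviving_probe_mbps=4.0)
print(ingress.new_flow("f1", 3.0))  # fits in the 4 Mb/s headroom -> GMTS
print(ingress.new_flow("f2", 3.0))  # only 1 Mb/s left -> best-effort
```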
Abstract:
Seafloor imagery is a rich source of data for the study of biological and geological processes. Among several applications, still images of the ocean floor can be used to build image composites referred to as photo-mosaics. Photo-mosaics provide a wide-area visual representation of the benthos, and enable applications as diverse as geological surveys, mapping, and detection of temporal changes in the morphology of biodiversity. We present an approach for creating globally aligned photo-mosaics using 3D position estimates provided by navigation sensors available in deep-water surveys. Without image registration, such navigation data does not provide enough accuracy to produce useful composite images. Results from a challenging data set of the Lucky Strike vent field at the Mid-Atlantic Ridge are reported.
Abstract:
HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money laundering fighters (FIUs, LEAs) and financial institutes (banks, insurance companies, etc.). This Master's thesis project was carried out at AIA, one of the partners of the HEMOLIA project in Barcelona. The objective of the thesis is to find the clusters in a network drawn using the financial data. An extensive literature survey has been carried out, and several standard algorithms related to networks have been studied and implemented. The clustering problem is NP-hard, and algorithms such as K-Means and hierarchical clustering have been applied to problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks which make them difficult to apply. The thesis proposes (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms, and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
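As context for the baseline the thesis builds on, here is a minimal pure-Python K-Means on 2-D points. The thesis applies clustering to financial networks; the data and the naive initialization here are purely illustrative.

```python
import math

def kmeans(points, k, iters=50):
    """Plain K-Means: alternate nearest-center assignment and mean update.
    Naive deterministic initialization (first k points), for illustration."""
    centers = list(points[:k])
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[i].append(p)
        # update step: recompute each center as its cluster's mean
        for i, cl in enumerate(clusters):
            if cl:  # keep the old center if a cluster ends up empty
                centers[i] = (sum(x for x, _ in cl) / len(cl),
                              sum(y for _, y in cl) / len(cl))
    return centers, clusters

# two well-separated blobs should be recovered as two clusters of three
pts = [(0.1, 0.0), (0.0, 0.2), (0.2, 0.1), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers, clusters = kmeans(pts, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```

A known drawback alluded to in the abstract is sensitivity to initialization, which is exactly what a genetic-algorithm search over cluster assignments can mitigate.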
Abstract:
CSCL applications are complex distributed systems that pose special requirements towards achieving success in educational settings. Flexible and efficient design of collaborative activities by educators is a key precondition in order to provide CSCL tailorable systems, capable of adapting to the needs of each particular learning environment. Furthermore, some parts of those CSCL systems should be reused as often as possible in order to reduce development costs. In addition, it may be necessary to employ special hardware devices, computational resources that reside in other organizations, or even exceed the possibilities of one specific organization. Therefore, the proposal of this paper is twofold: collecting collaborative learning designs (scripting) provided by educators, based on well-known best practices (collaborative learning flow patterns) in a standard way (IMS-LD), in order to guide the tailoring of CSCL systems by selecting and integrating reusable CSCL software units; and implementing those units in the form of grid services offered by third-party providers. More specifically, this paper outlines a grid-based CSCL system having these features and illustrates its potential scope and applicability by means of a sample collaborative learning scenario.
Abstract:
Purpose. The aim of this study was to identify new surfactants with low skin-irritant properties for use in pharmaceutical and cosmetic formulations, employing cell culture as an alternative method to in vivo testing. In addition, we sought to establish whether potential cytotoxic properties were related to the size of the counterions bound to the surfactants. Methods. Cytotoxicity was assessed in the mouse fibroblast cell line 3T6 and the human keratinocyte cell line NCTC 2544, using the MTT assay and uptake of the vital dye neutral red 24 h after dosing (NRU). Results. Lysine-derivative surfactants showed higher IC50s than did commercial anionic irritant compounds such as sodium dodecyl sulphate, proving to be no more harmful than amphoteric betaines. The aggressiveness of the surfactants depended upon the size of their constituent counterions: surfactants associated with lighter counterions showed proportionally higher aggressiveness than those with heavier ones. Conclusions. Synthetic lysine-derivative anionic surfactants are less irritant than commercial surfactants such as sodium dodecyl sulphate and hexadecyltrimethylammonium bromide, and are similar to betaines. These surfactants may offer promising applications in pharmaceutical and cosmetic preparations, representing a potential alternative to commercial anionic surfactants as a result of their low irritancy potential.
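The IC50 values reported above come from dose-response curves (MTT/NRU readouts). As an illustration only, the sketch below estimates an IC50 by log-linear interpolation between the two concentrations bracketing 50% viability; the viability numbers are invented, not the study's data.

```python
import math

def ic50(doses, viability):
    """Estimate IC50 by interpolating on log-concentration between the two
    doses that bracket 50% viability (viability in % of control, doses in
    increasing order). Returns None if 50% is never crossed. Illustrative only."""
    for (d0, v0), (d1, v1) in zip(zip(doses, viability),
                                  zip(doses[1:], viability[1:])):
        if v0 >= 50 >= v1:
            # fraction of the way from v0 down to v1 at which 50% is reached
            t = (v0 - 50) / (v0 - v1)
            # interpolate on log10(dose), the usual dose-response convention
            return 10 ** (math.log10(d0) + t * (math.log10(d1) - math.log10(d0)))
    return None

# hypothetical viability (% of control) for a surfactant dilution series
doses = [1, 10, 100, 1000]       # e.g. micrograms/mL, invented values
viability = [98, 85, 30, 5]
print(round(ic50(doses, viability), 1))  # ≈ 43.3
```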
Abstract:
The purpose of the study was to evaluate the shear bond strength of stainless steel orthodontic brackets directly bonded to extracted human premolar teeth. Fifty teeth were randomly divided into five groups: (1) System One (chemically cured composite resin), (2) Light Bond (light-cured composite resin), (3) Vivaglass Cem (self-curing glass ionomer cement), (4) Fuji Ortho LC (light-cured glass ionomer cement) used after 37% orthophosphoric acid etching of enamel, and (5) Fuji Ortho LC without orthophosphoric acid etching. The brackets were placed on the buccal and lingual surfaces of each tooth, and the specimens were stored in distilled water (24 hours) at 37°C and thermocycled. Teeth were mounted on acrylic block frames, and brackets were debonded using an Instron machine. Shear bond strength values at fracture (N) were recorded. ANOVA and Student-Newman-Keuls multiple comparison tests were performed (P < .05). The bonding failure site was recorded by stereomicroscope and analyzed by Chi-square test; selected specimens of each group were observed by scanning electron microscope. System One attained the highest bond strength. Light Bond and Fuji Ortho LC, when used with an acid-etching technique, obtained bond strengths within the range of estimated bond strength values for successful clinical bonding. Fuji Ortho LC and Vivaglass Cem left an almost clean enamel surface after debracketing.
Abstract:
This study carries out empirical research comparing the difficulties that arise from the use of fair value (FV) and historical cost (HC) in the agricultural sector. It also analyses the reliability of both valuation methods for the interpretation of information and for decision-making by the agents operating in the sector. Through an experiment conducted with students, farmers and accountants working in the agricultural sector, we find that they have more difficulties with, make larger errors in, and interpret worse, accounting information prepared at HC than that prepared at FV. In-depth interviews with farmers and agricultural accountants reveal flawed accounting practices arising from the need to apply HC in the sector in Spain. Given the complexities of calculating the cost of biological assets and the predominance of small farms in the sector in advanced Western countries, the study concludes that FV accounting constitutes an improvement in the use and development of accounting in the sector over HC accounting. Likewise, HC conveys a poorer representation of the real situation of farms.
Abstract:
Asthma prevalence in children and adolescents in Spain is 10-17%. It is the most common chronic illness during childhood. Prevalence has been increasing over the last 40 years, and there is considerable evidence that, among other factors, continued exposure to cigarette smoke results in asthma in children. No statistical or simulation model exists to forecast the evolution of childhood asthma in Europe. Such a model needs to incorporate the main risk factors that can be managed by medical authorities, such as tobacco (OR = 1.44), to establish how they affect the present generation of children. A simulation model using conditional probability and discrete event simulation for childhood asthma was developed and validated by simulating a realistic scenario. The parameters used for the model (input data) were those found in the literature, especially those related to the incidence of smoking in Spain. We also used data from a panel of experts from the Hospital del Mar (Barcelona) related to actual evolution and asthma phenotypes. The results obtained from the simulation established a threshold of a 15-20% smoking population for a reduction in the prevalence of asthma. This is still far from the current level in Spain, where 24% of people smoke. We conclude that more effort must be made to combat smoking and other childhood asthma risk factors in order to significantly reduce the number of cases. Once completed, this simulation methodology can realistically be used to forecast the evolution of childhood asthma as a function of variation in different risk factors.
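To illustrate how an odds ratio such as OR = 1.44 can drive a simple simulation of prevalence versus smoking exposure, here is a hedged Monte Carlo sketch. The baseline prevalence, population size and seed are illustrative assumptions, not the paper's calibrated discrete-event model.

```python
import random

def asthma_prevalence(smoke_exposure_rate, base_prev=0.12, odds_ratio=1.44,
                      n=200_000, seed=1):
    """Monte Carlo sketch: each simulated child is exposed to smoke with the
    given probability; exposure multiplies the odds of asthma by the OR.
    base_prev and n are invented parameters for illustration."""
    rng = random.Random(seed)
    base_odds = base_prev / (1 - base_prev)
    exposed_odds = base_odds * odds_ratio          # OR acts on the odds
    exposed_prev = exposed_odds / (1 + exposed_odds)
    cases = 0
    for _ in range(n):
        p = exposed_prev if rng.random() < smoke_exposure_rate else base_prev
        cases += rng.random() < p
    return cases / n

# lowering exposure from 24% (current Spanish smoking rate) toward the
# 15-20% threshold region reduces simulated prevalence
print(asthma_prevalence(0.24), ">", asthma_prevalence(0.15))
```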
Abstract:
Preparation of (S)-1-chloro-2-octanol and (S)-1-bromo-2-octanol was carried out by the enzymatic hydrolysis of halohydrin palmitates using biocatalysts. Halohydrin palmitates were prepared by various methods from palmitic acid and 1,2-octanediol. A tandem hydrolysis was carried out using lipases from Candida antarctica (Novozym® 435), Rhizomucor miehei (Lipozyme IM), and "resting cells" from a Rhizopus oryzae strain that was not mycotoxigenic. The influence of the enzyme and the reaction medium on the selective hydrolysis of isomeric mixtures of halohydrin esters is described. Novozym® 435 allowed the preparation of (S)-1-chloro-2-octanol and (S)-1-bromo-2-octanol after 1-3 h of reaction at 40 °C in [BMIM][PF6].
Abstract:
This article introduces a new interface for T-Coffee, a consistency-based multiple sequence alignment program. This interface provides easy and intuitive access to the most popular functionalities of the package. These include the default T-Coffee mode for protein and nucleic acid sequences, the M-Coffee mode that allows combining the output of any other aligners, and the template-based modes of T-Coffee that deliver high-accuracy alignments using structural or homology-derived templates. The three available template modes are Expresso for the alignment of proteins with known 3D structures, R-Coffee to align RNA sequences with conserved secondary structures, and PSI-Coffee to accurately align distantly related sequences using homology extension. The new server benefits from recent improvements of the T-Coffee algorithm, can align up to 150 sequences of up to 10,000 residues, and is available from both http://www.tcoffee.org and its main mirror http://tcoffee.crg.cat.
Abstract:
Flood simulation studies use spatial-temporal rainfall data as input to distributed hydrological models. A correct description of rainfall in space and time contributes to improvements in hydrological modelling and design. This work focuses on the analysis of 2-D convective structures (rain cells), whose contribution is especially significant in most flood events. The objective of this paper is to provide statistical descriptors and distribution functions for the characteristics of convective structures of precipitation systems producing floods in Catalonia (NE Spain). To achieve this purpose, heavy rainfall events recorded between 1996 and 2000 have been analysed. By means of weather radar, and applying 2-D radar algorithms, a distinction between convective and stratiform precipitation is made. These data are introduced into and analysed with a GIS. In a first step, different groups of connected pixels with convective precipitation are identified. Only convective structures with an area greater than 32 km² are selected. Then, geometric characteristics (area, perimeter, orientation and dimensions of the ellipse) and rainfall statistics (maximum, mean, minimum, range, standard deviation, and sum) of these structures are obtained and stored in a database. Finally, descriptive statistics for selected characteristics are calculated and statistical distributions are fitted to the observed frequency distributions. Statistical analyses reveal that the Generalized Pareto distribution for the area, and the Generalized Extreme Value distribution for the perimeter, dimensions, orientation and mean areal precipitation, are the statistical distributions that best fit the observed ones. The statistical descriptors and probability distribution functions obtained are of direct use as input to spatial rainfall generators.
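As a sketch of the kind of distribution fitting described above, the following pure-Python example draws synthetic "cell areas" from a Generalized Pareto distribution and recovers its parameters with simple method-of-moments estimators. The paper's actual fitting procedure and parameter values are not reproduced here; all numbers are illustrative.

```python
import random
import statistics

def gpd_sample(n, shape, scale, seed=42):
    """Draw from a Generalized Pareto distribution (threshold 0, shape > 0)
    via the inverse CDF: x = (scale/shape) * ((1-u)**(-shape) - 1)."""
    rng = random.Random(seed)
    return [(scale / shape) * ((1 - rng.random()) ** -shape - 1)
            for _ in range(n)]

def gpd_fit_moments(xs):
    """Method-of-moments GPD estimates (valid for shape < 1/2):
    mean^2/var = 1 - 2*shape  =>  shape = (1 - mean^2/var) / 2
    mean = scale/(1 - shape)  =>  scale = mean * (1 - shape)."""
    m = statistics.fmean(xs)
    v = statistics.variance(xs)
    shape = (1 - m * m / v) / 2
    scale = m * (1 - shape)
    return shape, scale

# synthetic "convective cell areas" with shape 0.2 and scale 50 (km^2)
areas = gpd_sample(50_000, shape=0.2, scale=50.0)
shape_hat, scale_hat = gpd_fit_moments(areas)
print(round(shape_hat, 2), round(scale_hat, 1))  # close to 0.2 and 50.0
```

In practice maximum-likelihood fitting (e.g. `scipy.stats.genpareto.fit`) is usually preferred over moments for heavy-tailed data; the moment estimators are used here only to keep the sketch dependency-free.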
Abstract:
Differentiation between photoallergenic and phototoxic reactions induced by low molecular weight compounds represents a current problem. The use of keratinocytes as a potential tool for the detection of photoallergens as opposed to photoirritants is considered an interesting strategy for developing in vitro methods. We have previously demonstrated the possibility of using the human keratinocyte cell line NCTC2544 and the production of interleukin-18 (IL-18) to screen low molecular weight sensitizers. The purpose of this work was to explore the possibility of using the NCTC2544 assay to identify photoallergens and discriminate them from phototoxic chemicals. First, we identified a suitable condition of UV irradiation (3.5 J/cm²) by investigating the effect of UVA irradiation on intracellular IL-18 in untreated or chlorpromazine (a representative phototoxic compound)-treated NCTC2544 cells. Then, the effect of UVA irradiation on NCTC2544 cells treated with increasing concentrations of 15 compounds was investigated, including photoallergens (benzophenone, 4-tert-butyl-4'-methoxydibenzoylmethane, 2-ethylhexyl-p-methoxycinnamate, ketoprofen, 6-methylcoumarin); compounds that are both photoirritant and photoallergen (4-aminobenzoic acid, chlorpromazine, promethazine); photoirritants (acridine, ibuprofen, 8-methoxypsoralen, retinoic acid); and negative compounds (lactic acid, SDS and p-phenylenediamine). Twenty-four hours after exposure, cytotoxicity was evaluated by the MTT assay or LDH leakage, while ELISA was used to measure the production of IL-18. At the maximal concentration assayed with non-cytotoxic effects (CV80 under irradiated conditions), all tested photoallergens induced a significant, dose-dependent increase of intracellular IL-18 following UVA irradiation, whereas photoirritants failed to do so. We suggest that this system may be useful for the in vitro evaluation of the photoallergic potential of chemicals.
Abstract:
This paper presents empirical research comparing the accounting difficulties that arise from the use of two valuation methods for biological assets, fair value (FV) and historical cost (HC) accounting, in the agricultural sector. It also compares how reliable each valuation method is in the decision-making process of agents within the sector. By conducting an experiment with students, farmers, and accountants operating in the agricultural sector, we find that they have more difficulties, make larger miscalculations and make poorer judgements with HC accounting than with FV accounting. In-depth interviews uncover flawed accounting practices in the agricultural sector in Spain in order to meet HC accounting requirements. Given the complexities of cost calculation for biological assets and the predominance of small family business units in advanced Western countries, the study concludes that accounting can be more easily applied in the agricultural sector under FV than HC accounting, and that HC conveys a less accurate grasp of the real situation of a farm.
Abstract:
Cooperative transmission can be seen as a "virtual" MIMO system, where the multiple transmit antennas are in fact distributed among the antennas at both the source and the relay terminal. Depending on the system design, diversity/multiplexing gains are achievable. This design involves the definition of the type of retransmission (incremental redundancy, repetition coding), the design of the distributed space-time codes, the error-correcting scheme, the operation of the relay (decode-and-forward or amplify-and-forward) and the number of antennas at each terminal. Proposed schemes are evaluated in different conditions in combination with forward error correcting (FEC) codes, both for linear and near-optimum (sphere decoder) receivers, for their possible implementation in downlink high-speed packet services of cellular networks. Results show the benefits of coded cooperation over direct transmission in terms of increased throughput. It is shown that multiplexing gains are observed even if the mobile station features a single antenna, provided that cell-wide reuse of the relay radio resource is possible.
Abstract:
Phase-encoded nanostructures such as Quick Response (QR) codes made of metallic nanoparticles are proposed for use in security and authentication applications. We present a polarimetric optical method able to authenticate random phase-encoded QR codes. The system is illuminated with polarized light, and the QR code is encoded using a phase-only random mask. Using classification algorithms, it is possible to validate the QR code from examination of the polarimetric signature of the speckle pattern. We used the Kolmogorov-Smirnov statistical test and Support Vector Machine algorithms to authenticate the phase-encoded QR codes using polarimetric signatures.
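A minimal pure-Python version of the two-sample Kolmogorov-Smirnov statistic used in such a classification pipeline (the maximum vertical gap between the two empirical CDFs) can be sketched as follows; the "speckle" samples are toy values, not measured polarimetric data.

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    i = j = 0
    d = 0.0
    # merge-walk both sorted samples, tracking the CDF gap at each step
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

# toy "speckle intensity" samples: similar distributions give a small D,
# disjoint (shifted) distributions give the maximum D of 1.0
same = ks_statistic([0.1, 0.2, 0.3, 0.4], [0.15, 0.25, 0.35, 0.45])
shifted = ks_statistic([0.1, 0.2, 0.3, 0.4], [1.1, 1.2, 1.3, 1.4])
print(same, shifted)  # 0.25 1.0
```

In a classifier, D (or the p-value derived from it) would be one feature comparing a candidate code's polarimetric signature against reference signatures; `scipy.stats.ks_2samp` provides the same statistic with a p-value.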