Abstract:
Layering is a widely used method for structuring data in CAD-models. During the last few years national standardisation organisations, professional associations, user groups for particular CAD-systems, individual companies etc. have issued numerous standards and guidelines for the naming and structuring of layers in building design. In order to increase the integration of CAD data in the industry as a whole, ISO recently decided to define an international standard for layer usage. The resulting standard proposal, ISO 13567, is a rather complex framework standard which strives to be more of a union than the least common denominator of the capabilities of existing guidelines. A number of principles have been followed in the design of the proposal. The first one is the separation of the conceptual organisation of information (semantics) from the way this information is coded (syntax). The second one is orthogonality - the fact that many ways of classifying information are independent of each other and can be applied in combinations. The third overriding principle is the reuse of existing national or international standards whenever appropriate. The fourth principle allows users to apply well-defined subsets of the overall superset of possible layer names. This article describes the semantic organisation of the standard proposal as well as its default syntax. Important information categories deal with the party responsible for the information, the type of building element shown, whether a layer contains the direct graphical description of a building part or additional information needed in an output drawing, etc. Non-mandatory information categories facilitate the structuring of information in rebuilding projects, use of layers for spatial grouping in large multi-storey projects, and storing multiple representations intended for different drawing scales in the same model.
Pilot testing of ISO 13567 is currently being carried out in a number of countries which have been involved in the definition of the standard. In the article two implementations, which have been carried out independently in Sweden and Finland, are described. The article concludes with a discussion of the benefits and possible drawbacks of the standard. Incremental development within the industry (where "best practice" can become "common practice" via a standard such as ISO 13567) is contrasted with the more idealistic scenario of building product models. The relationship between CAD-layering, document management, product modelling and building element classification is also discussed.
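The fixed-position field scheme the abstract describes can be illustrated in code. The sketch below is a simplified Python illustration, assuming a hypothetical layout of the mandatory fields (two-character agent, six-character element code, two-character presentation, padded with "-"); the normative field widths and code tables are defined by ISO 13567 and its national adaptations, not by this sketch.

```python
# Illustrative sketch of composing/parsing an ISO 13567-style layer name.
# The field names and widths below are assumptions for illustration only.

MANDATORY_FIELDS = [
    ("agent", 2),         # party responsible for the information
    ("element", 6),       # building element classification code
    ("presentation", 2),  # e.g. element graphics vs. drawing annotation
]

def compose_layer_name(agent, element, presentation, pad="-"):
    """Build a fixed-position layer name; short codes are padded."""
    values = {"agent": agent, "element": element, "presentation": presentation}
    parts = []
    for name, width in MANDATORY_FIELDS:
        code = values[name]
        if len(code) > width:
            raise ValueError(f"{name} code {code!r} exceeds {width} chars")
        parts.append(code.ljust(width, pad))
    return "".join(parts)

def parse_layer_name(layer, pad="-"):
    """Split a fixed-position layer name back into its fields."""
    fields, pos = {}, 0
    for name, width in MANDATORY_FIELDS:
        fields[name] = layer[pos:pos + width].rstrip(pad)
        pos += width
    return fields
```

Fixed positions are what make the orthogonality principle workable in practice: each classification occupies its own character range, so any combination of field values yields a well-formed name and a subset scheme can simply pad out the fields it does not use.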
Abstract:
Financing trade between economic agents located in different countries is affected by many types of risks, resulting from incomplete information about the debtor, the problems of enforcing international contracts, or the prevalence of political and financial crises. Trade is important for economic development and the availability of trade finance is essential, especially for developing countries. Relatively few studies treat the topic of political risk, particularly in the context of international lending. This thesis explores new ground to identify links between political risk and international debt defaults. The core hypothesis of the study is that the default probability of debt increases with increasing political risk in the country of the borrower. The thesis consists of three essays that support the hypothesis from different angles of the credit evaluation process. The first essay takes the point of view of an international lender assessing the credit risk of a public borrower. The second investigates creditworthiness assessment of companies. The obtained results are substantiated in the third essay that deals with an extensive political risk survey among finance professionals in developing countries. The financial instruments of core interest are export credit guaranteed debt initiated between the Export Credit Agency of Finland and buyers in 145 countries between 1975 and 2006. Default events of the foreign credit counterparts are conditioned on country-specific macroeconomic variables, corporate-specific accounting information as well as political risk indicators from various international sources. Essay 1 examines debt issued to government controlled institutions and conditions public default events on traditional macroeconomic fundamentals, in addition to selected political and institutional risk factors. Confirming previous research, the study finds country indebtedness and the GDP growth rate to be significant indicators of public default. 
Further, it is shown that public defaults respond to various political risk factors. However, the impact of the risk varies between countries at different stages of economic development. Essay 2 proceeds by investigating political risk factors as conceivable drivers of corporate default and uses traditional accounting variables together with new political risk indicators in the credit evaluation of private debtors. The study finds links between corporate default and leverage, as well as between corporate default and the general investment climate and measures of conflict in the debtor country. Essay 3 concludes the thesis by offering survey evidence on the impact of political risk on debt default, as perceived and experienced by 103 finance professionals in 38 developing countries. Taken together, the results of the thesis suggest that various forms of political risk are associated with international debt defaults and continue to pose great concerns for both international creditors and borrowers in developing countries. The study provides new insights on the importance of variable selection in country risk analysis, and shows how political risk is actually perceived and experienced in the riskier, often lower income countries of the global economy.
Abstract:
In this paper we develop and numerically explore the modeling heuristic of using saturation attempt probabilities as state dependent attempt probabilities in an IEEE 802.11e infrastructure network carrying packet telephone calls and TCP controlled file downloads, using Enhanced Distributed Channel Access (EDCA). We build upon the fixed point analysis and performance insights in [1]. When there are a certain number of nodes of each class contending for the channel (i.e., have nonempty queues), then their attempt probabilities are taken to be those obtained from saturation analysis for that number of nodes. Then we model the system queue dynamics at the network nodes. With the proposed heuristic, the system evolution at channel slot boundaries becomes a Markov renewal process, and regenerative analysis yields the desired performance measures. The results obtained from this approach match well with ns2 simulations. We find that, with the default IEEE 802.11e EDCA parameters for AC 1 and AC 3, the voice call capacity decreases if even one file download is initiated by some station. Subsequently, reducing the voice calls increases the file download capacity almost linearly (by 1/3 Mbps per voice call for the 11 Mbps PHY).
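The saturation attempt probabilities that seed the heuristic come from a fixed-point analysis. As a minimal sketch of that kind of computation, the following solves a single-class, Bianchi-style DCF fixed point by damped iteration; this is an illustration under simplified assumptions, not the multi-class EDCA analysis of the paper, and the backoff parameters W and m are placeholder values.

```python
def saturation_attempt_probability(n, W=32, m=5, iters=200):
    """Per-slot attempt probability tau of n saturated nodes, from the
    Bianchi-style fixed point for a single access category:
        p   = 1 - (1 - tau)^(n-1)                       # collision prob.
        tau = 2(1-2p) / ((1-2p)(W+1) + p*W*(1-(2p)^m))  # backoff model
    solved here by damped fixed-point iteration."""
    tau = 0.1
    for _ in range(iters):
        p = 1.0 - (1.0 - tau) ** (n - 1)
        cand = 2.0 * (1.0 - 2.0 * p) / (
            (1.0 - 2.0 * p) * (W + 1) + p * W * (1.0 - (2.0 * p) ** m)
        )
        tau = 0.5 * tau + 0.5 * cand  # damping keeps the iteration stable
    return tau
```

More contenders mean more collisions and longer backoffs, so tau falls as n grows; the heuristic in the paper uses class-dependent attempt probabilities of this kind, indexed by the number of nodes with nonempty queues in each access category.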
Abstract:
The growing interest in sequencing with higher throughput in the last decade has led to the development of new sequencing applications. This thesis concentrates on optimizing DNA library preparation for the Illumina Genome Analyzer II sequencer. The library preparation steps that were optimized include fragmentation, PCR purification and quantification. DNA fragmentation was performed with focused sonication at different concentrations and durations. Two column-based PCR purification methods, a gel matrix method and a magnetic bead-based method were compared. Quantitative PCR and gel electrophoresis in a chip were compared for DNA quantification. The magnetic bead purification was found to be the most efficient and flexible purification method. The fragmentation protocol was changed to produce longer fragments to be compatible with longer sequencing reads. Quantitative PCR correlates better with the cluster number and should thus be considered the default quantification method for sequencing. As a result of this study, more data have been acquired from sequencing at lower costs, and troubleshooting has become easier as qualification steps have been added to the protocol. New sequencing instruments and applications will create a demand for further optimizations in the future.
Abstract:
The Government of India has announced the Greening India Mission (GIM) under the National Climate Change Action Plan. The Mission aims to restore and afforest about 10 Mha over the period 2010-2020 under different sub-missions covering moderately dense and open forests, scrub/grasslands, mangroves, wetlands, croplands and urban areas. Even though the main focus of the Mission is to address mitigation and adaptation aspects in the context of climate change, the adaptation component is inadequately addressed. There is a need for increased scientific input in the preparation of the Mission. The mitigation potential is estimated by simply multiplying global default biomass growth rate values by area. This estimate is incomplete, as it does not include all the carbon pools, phasing, differing growth rates, etc. The mitigation potential estimated using the Comprehensive Mitigation Analysis Process model for the GIM for the year 2020 could offset 6.4% of the projected national greenhouse gas emissions, compared to the GIM estimate of only 1.5%, excluding any emissions due to harvesting or disturbances. The selection of potential locations for different interventions and species choice under the GIM must be based on the use of modelling, remote sensing and field studies. The forest sector provides an opportunity to promote mitigation and adaptation synergy, which is not adequately addressed in the GIM. Since many of the interventions proposed are innovative and limited scientific knowledge exists, there is a need for an unprecedented level of collaboration between the research institutions and the implementing agencies such as the Forest Departments, which is currently non-existent. The GIM could propel systematic research into forestry and climate change issues and thereby provide global leadership in this new and emerging science.
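The "default growth rate times area" estimate that the abstract criticises amounts to a one-line calculation. The sketch below uses placeholder numbers for illustration (the biomass increment and carbon fraction are assumed IPCC-style defaults, not the figures used by the GIM or by the CoMAP model), and deliberately omits exactly what the text says such estimates omit: soil and other carbon pools, phasing, differing growth rates, and harvest or disturbance losses.

```python
# Back-of-envelope afforestation mitigation estimate of the kind the
# text criticises: area x default biomass growth rate only.
# All numeric values below are illustrative placeholders.

AREA_MHA = 10.0              # afforestation target, million ha (from the Mission)
GROWTH_T_DM_PER_HA_YR = 3.0  # assumed default above-ground biomass increment
CARBON_FRACTION = 0.47       # assumed IPCC-style carbon fraction of dry matter
CO2_PER_C = 44.0 / 12.0      # molecular-weight conversion, C -> CO2

def simple_mitigation_mtco2_per_year(area_mha, growth, cf=CARBON_FRACTION):
    """Annual CO2 removal in Mt: Mha x t dm/ha/yr x tC/t dm x tCO2/tC.
    Ignores soil carbon, phasing, differing growth rates and harvesting."""
    return area_mha * growth * cf * CO2_PER_C
```

A process model such as CoMAP instead tracks pools and phased planting over time, which is why its estimate can differ so sharply from the simple product.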
Abstract:
The paper describes the sensitivity of the simulated precipitation to changes in the convective relaxation time scale (TAU) of the Zhang and McFarlane (ZM) cumulus parameterization in the NCAR Community Atmosphere Model version 3 (CAM3). In the default configuration of the model, the prescribed value of TAU, a characteristic time scale with which convective available potential energy (CAPE) is removed at an exponential rate by convection, is assumed to be 1 h. However, some recent observational findings suggest that it is larger by around one order of magnitude. In order to explore the sensitivity of the model simulation to TAU, two model frameworks have been used, namely, aqua-planet and actual-planet configurations. Numerical integrations have been carried out using different values of TAU, and its effect on simulated precipitation has been analyzed. The aqua-planet simulations reveal that when TAU increases, the rate of deep convective precipitation (DCP) decreases and this leads to an accumulation of convective instability in the atmosphere. Consequently, the moisture content in the lower- and mid-troposphere increases. On the other hand, the shallow convective precipitation (SCP) and large-scale precipitation (LSP) intensify, predominantly the SCP, thus capping the accumulation of convective instability in the atmosphere. The total precipitation (TP) remains approximately constant, but the proportion of the three components changes significantly, which in turn alters the vertical distribution of total precipitation production. The vertical structure of moist heating changes from a vertically extended profile to a bottom-heavy profile with the increase of TAU. The altitude of the maximum vertical velocity shifts from the upper troposphere to the lower troposphere. A similar response was seen in the actual-planet simulations. With an increase in TAU from 1 h to 8 h, there was a significant improvement in the simulation of the seasonal mean precipitation.
The fraction of deep convective precipitation was in much better agreement with satellite observations.
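In the ZM closure described above, convection relaxes CAPE toward zero at an exponential rate with time scale TAU, so the choice of TAU directly controls how quickly deep convection consumes instability. A minimal sketch of this relationship (illustrative values only; the actual scheme couples this removal rate to a full mass-flux formulation):

```python
import math

def cape_remaining(cape0, t_hours, tau_hours):
    """CAPE left after t hours when convection removes it at an
    exponential rate with relaxation time scale tau:
        CAPE(t) = CAPE(0) * exp(-t / tau)
    """
    return cape0 * math.exp(-t_hours / tau_hours)
```

With the default TAU = 1 h, nearly two-thirds of the initial CAPE is consumed within an hour, whereas TAU = 8 h leaves most of it in place; this slower removal is what weakens deep convective precipitation and lets shallow convective and large-scale precipitation take up the balance.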
Abstract:
A thiamin-binding protein was isolated and characterized from chicken egg white by affinity chromatography on thiamin pyrophosphate coupled to aminoethyl-Sepharose. The high specificity of interaction between the thiamin-binding protein and the riboflavin-binding protein of the egg white, with a protein/protein molar ratio of 1.0, led to the development of an alternative procedure that used the riboflavin-binding protein immobilized on CNBr-activated Sepharose as the affinity matrix. The thiamin-binding protein thus isolated was homogeneous by the criteria of polyacrylamide-gel disc electrophoresis, double immunodiffusion and sodium dodecyl sulphate/polyacrylamide-gel electrophoresis, had a mol.wt. of 38,000 +/- 2000 and was not a glycoprotein. The protein bound [14C]thiamin in a molar ratio of 1.0, with a dissociation constant (Kd) of 0.3 μM.
Abstract:
1. Mevalonate pyrophosphate decarboxylase of rat liver is inhibited by various phenyl and phenolic acids. 2. Some of the phenyl and phenolic acids also inhibited mevalonate phosphate kinase. 3. Compounds with the phenyl-vinyl structure were more effective. 4. Kinetic studies showed that some of the phenolic acids compete with the substrates, mevalonate 5-phosphate and mevalonate 5-pyrophosphate, whereas others inhibit uncompetitively. 5. Dihydroxyphenyl and trihydroxyphenyl compounds and p-chlorophenoxyisobutyrate, a hypocholesterolaemic drug, had no effect on these enzymes. 6. Of the three mevalonate-metabolizing enzymes, mevalonate pyrophosphate decarboxylase has the lowest specific activity and is probably the rate-determining step in this part of the pathway.
Abstract:
We present WebGeSTer DB, the largest database of intrinsic transcription terminators (http://pallab.serc.iisc.ernet.in/gester). The database comprises a million terminators identified in 1060 bacterial genome sequences and 798 plasmids. Users can obtain both graphic and tabular results on putative terminators based on default or user-defined parameters. The results are arranged in different tiers to facilitate retrieval, as per the specific requirements. An interactive map has been incorporated to visualize the distribution of terminators across the whole genome. Analysis of the results, both at the whole-genome level and with respect to terminators downstream of specific genes, offers insight into the prevalence of canonical and non-canonical terminators across different phyla. The data in the database reinforce the paradigm that intrinsic termination is a conserved and efficient regulatory mechanism in bacteria. Our database is freely accessible.
Abstract:
The DNA content in the silk glands of the non-mulberry silkworm Philosamia ricini increases continuously during the fourth and fifth instars of larval development indicating high levels of DNA replication in this terminally differentiated tissue. Concomitantly, the DNA polymerase alpha activity also increases in the middle and the posterior silk glands during development, reaching maximal levels in the middle of the fifth larval instar. A comparable level of DNA polymerase delta/epsilon was also observed in this highly replicative tissue. The DNA polymerase alpha-primase complex from the silk glands of P. ricini has been purified to homogeneity by conventional column chromatography as well as by immunoaffinity techniques. The molecular mass of the native enzyme is 560 kDa and the enzyme comprises six non-identical subunits. The identity of the enzyme as DNA polymerase alpha has been established by its sensitivity to inhibitors such as aphidicolin, N-ethylmaleimide, butylphenyl-dGTP, butylanilino-dATP and antibodies to polymerase alpha. The enzyme possesses primase activity capable of initiating DNA synthesis on single-stranded DNA templates. The tight association of polymerase and primase activities at a constant ratio of 6:1 is observed through all the purification steps. The 180 kDa subunit harbours the polymerase activity, while the primase activity is associated with the 45 kDa subunit.
Abstract:
Chicken egg yolk biotin-binding protein-I (BBP-I) has been purified to homogeneity along with the tetrameric BBP-II by a common protocol. The purification includes delipidation of egg yolk by butanol extraction, DEAE-Sephacel chromatography, treatment with guanidinium chloride and biotin-aminohexyl-Sepharose affinity chromatography. The identity of purified BBP-I was ascertained by its physicochemical properties as well as by its immunological cross-reactivity and precursor-product relationship with BBP-II.
Abstract:
A radical cyclization based methodology has been applied for the formal total synthesis of (+/-)-enterolactone (1), the first lignan isolated from human source. Bromoacetalization reaction of the cinnamyl alcohols 7 and 13 using ethyl vinyl ether and NBS generated the bromoacetals 8 and 15. The 5-exo-trig radical cyclization reaction of the bromoacetals 8 and 15 with in situ generated catalytic tri-n-butyltin hydride and AIBN furnished a 3 : 2 diastereomeric mixture of the cyclic acetals 9 and 16. Sonochemically accelerated Jones oxidation of the cyclic acetals 9 and 16 yielded the gamma-butyrolactones 10 and 12, completing the formal total synthesis of (+/-)-enterolactone. Alternatively, radical cyclization of the bromoacetate 17 furnished a 1 : 2 mixture of the lactone 10 and the reduced product 18.
Abstract:
With the emergence of the Internet, the global connectivity of computers has become a reality. The Internet has progressed to provide many user-friendly tools like Gopher, WAIS, WWW etc. for information publishing and access. The WWW, which integrates all other access tools, also provides a very convenient means for publishing and accessing multimedia and hypertext linked documents stored in computers spread across the world. With the emergence of WWW technology, most of the information activities are becoming Web-centric. Once the information is published on the Web, a user can access this information from any part of the world. A Web browser like Netscape or Internet Explorer is used as a common user interface for accessing information/databases. This greatly relieves a user from learning the search syntax of individual information systems. Libraries are taking advantage of these developments to provide access to their resources on the Web. CDS/ISIS is a very popular bibliographic information management software used in India. In this tutorial we present details of integrating CDS/ISIS with the WWW. A number of tools are now available for making CDS/ISIS databases accessible on the Internet/Web. Some of these are 1) the WAIS_ISIS Server, 2) the WWWISIS Server and 3) the IQUERY Server. In this tutorial, we have explained in detail the steps involved in providing Web access to an existing CDS/ISIS database using the freely available software, WWWISIS. This software is developed, maintained and distributed by BIREME, the Latin American & Caribbean Centre on Health Sciences Information. WWWISIS acts as a server for CDS/ISIS databases in a WWW client/server environment. It supports functions for searching, formatting and data entry operations over CDS/ISIS databases. WWWISIS is available for various operating systems. We have tested this software on Windows 95, Windows NT and Red Hat Linux release 5.2 (Apollo), kernel 2.0.36, on an i686.
The testing was carried out using IISc's main library's OPAC containing more than 80,000 records and Current Contents issues (bibliographic data) containing more than 25,000 records. WWWISIS is fully compatible with the CDS/ISIS 3.07 file structure. However, on a system running Unix or its variants, there is no guarantee of this compatibility. It is therefore safer to recreate the master and the inverted files, using utilities provided by BIREME, under the Unix environment.