899 results for Deep-focus earthquake
Abstract:
An efficient method is developed for an iterative solution of the Poisson and Schrödinger equations, which allows systematic studies of the properties of the electron gas in linear deep-etched quantum wires. A much simpler two-dimensional (2D) approximation is developed that accurately reproduces the results of the 3D calculations. A 2D Thomas-Fermi approximation is then derived and shown to give a good account of average properties. Further, we prove that an analytic form due to Shikin et al. is a good approximation to the electron density given by the self-consistent methods.
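The self-consistent loop the abstract describes can be sketched in one dimension: solve the Schrödinger equation in the current potential, build the electron density, solve the Poisson equation for the Hartree potential, mix, and repeat. The grid, parabolic confinement, electron number and coupling constant below are illustrative assumptions, not the paper's deep-etched wire geometry.

```python
import numpy as np

# Minimal 1D Schrodinger-Poisson self-consistency sketch (dimensionless units).
# Confinement, electron number and coupling strength are assumed for illustration.
n, L = 200, 10.0
x = np.linspace(-L / 2, L / 2, n)
h = x[1] - x[0]
v_conf = 0.5 * x**2      # assumed parabolic confinement
n_el = 2.0               # assumed electron number per unit length
alpha = 0.1              # assumed (weak) interaction strength

# Kinetic energy: -(1/2) d^2/dx^2 by central finite differences
T = (np.diag(np.full(n, 1.0 / h**2))
     - np.diag(np.full(n - 1, 0.5 / h**2), 1)
     - np.diag(np.full(n - 1, 0.5 / h**2), -1))

# 1D Laplacian with Dirichlet boundaries for the Poisson step
P = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2

v_h = np.zeros(n)
for it in range(500):
    E, psi = np.linalg.eigh(T + np.diag(v_conf + v_h))
    rho = n_el * psi[:, 0]**2 / h              # density from the lowest subband
    v_new = np.linalg.solve(P, -alpha * rho)   # Poisson: v'' = -alpha * rho
    if np.max(np.abs(v_new - v_h)) < 1e-8:
        break                                  # self-consistency reached
    v_h += 0.3 * (v_new - v_h)                 # linear mixing for stability
```

Linear mixing of the old and new Hartree potential is the standard trick to keep such an iteration from oscillating; production codes use more sophisticated schemes (Anderson mixing, predictor-correctors).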
Abstract:
This thesis, entitled 'Distribution, Diversity and Biology of Deep-sea Fishes of the Indian EEZ', addresses fishing rights in the deep-sea sector and the responsibilities they entail, which have been a vexed issue since the mid-nineties; various stakeholders hold different opinions on the modalities of harnessing the marine fisheries wealth, especially from oceanic and deeper waters. The exploitation and utilization of these resources require technology development and upgradation in harvest and post-harvest areas, besides shore infrastructure for berthing, handling, storing and processing facilities. At present, although deep-sea fishes do not have a ready market in India, they can be converted into value-added products. Many problems have so far confronted the deep-sea fishing sector, not allowing it to reach its full potential. Hence, there should be a sound deep-sea fishing policy revolving around the upgradation of the capabilities of small-scale fishermen, who have the inherent skills but lack adequate support to develop themselves and to acquire vessels capable of operating in farther and deeper waters. Prospects for the commercial exploitation and utilization of deep-sea fishes were analyzed using SWOT analysis.
Abstract:
Reducing fishing pressure in coastal waters is the need of the day in the Indian marine fisheries sector, which is fast changing from a mere vocational activity to a capital-intensive industry. It requires continuous monitoring of resource exploitation through a scientifically acceptable methodology: data on the production of each species stock, the number and characteristics of the fishing gears of the fleet, various biological characteristics of each stock, the impact of fishing on the environment, and the role of fishery-independent data on availability and abundance. Besides this, there are issues relating to capabilities in stock assessment, taxonomy research, biodiversity, conservation and fisheries management. Generation of a reliable database over a fixed time frame, and its analysis and interpretation, are necessary before drawing conclusions on stock size, maximum sustainable yield and maximum economic yield, and before implementing various fishing regulatory measures. India, being a signatory to several treaties and conventions, is obliged to assess the exploited stocks and manage them at sustainable levels. Besides, the nation is bound by its obligation of protein food security to its people and livelihood security to those engaged in marine fishing related activities. There are also regional variabilities in fishing technology and fishery resources. All these make it mandatory for India to continue and strengthen its marine capture fisheries research in general and deep-sea fisheries research in particular. Against this background, an attempt is made to strengthen knowledge of deep-sea fish biodiversity, to generate data on the distribution, abundance and catch per unit effort of fishery resources available beyond 200 m in the EEZ off the southwest coast of India, and to unravel aspects of the life history traits of potentially important non-conventional fish species inhabiting depths beyond 200 m.
This study was carried out as part of the Project on Stock Assessment and Biology of Deep Sea Fishes of Indian EEZ (MoES, Govt. of India).
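Of the assessment quantities mentioned above, maximum sustainable yield has a simple closed form in the classical Schaefer surplus-production model: surplus production peaks at half the carrying capacity, giving MSY = rK/4. The r and K values below are hypothetical illustration values, not estimates for any Indian stock.

```python
# Schaefer surplus-production model: dB/dt = r*B*(1 - B/K) - catch.
# Surplus production is maximised at B = K/2, where it equals MSY = r*K/4.
# The parameter values here are hypothetical, for illustration only.
def schaefer_surplus(biomass, r, K):
    """Annual surplus production at a given stock biomass."""
    return r * biomass * (1.0 - biomass / K)

r, K = 0.4, 100_000.0    # intrinsic growth rate (1/yr), carrying capacity (t)
msy = r * K / 4.0        # maximum sustainable yield (t/yr)
b_msy = K / 2.0          # biomass at which MSY is taken
```

Fitting r and K to real catch-and-effort series, and then managing toward B_MSY, is the simplest version of the stock-assessment workflow the abstract refers to.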
Abstract:
The aim of this paper is to expand on previous quantitative and qualitative research into the use of electronic information resources and their impact on the information behaviour of academics at Catalan universities.
Abstract:
Hepcidin is a cysteine-rich short peptide of the innate immune system of fishes, equipped to prevent the proliferation of invading pathogens such as bacteria and viruses by limiting iron availability and activating intracellular cascades. Hepcidins are diverse in teleost fishes, owing to their varied aquatic environments, including differences in pathogen exposure, oxygenation and iron concentration. In the present study, we report an 87-amino acid (aa) preprohepcidin (Hepc-CB1), with a signal peptide of 24 aa, a prodomain of 39 aa and a bioactive mature peptide of 24 aa, from the gill mRNA transcripts of the deep-sea fish spinyjaw greeneye, Chlorophthalmus bicornis. Molecular characterisation and phylogenetic analysis assigned the peptide to the HAMP2-like group, with a mature peptide of 2.53 kDa, a net positive charge (+3) and the capacity to form a β-hairpin-like structure configured by 8 conserved cysteines. The present work provides new insight into the mass gene duplication events and adaptive evolution of hepcidin isoforms with respect to environmental influences and positive Darwinian selection. This work reports a novel hepcidin isoform under the group HAMP2 from a non-acanthopterygian deep-sea fish, C. bicornis.
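A peptide's approximate net charge at physiological pH can be estimated by a simple residue count, positives (Lys/Arg) minus negatives (Asp/Glu). The 24-residue sequence below is entirely hypothetical (the actual Hepc-CB1 mature-peptide sequence is not reproduced in this abstract), and ignoring His and the termini is a crude approximation.

```python
# Crude net-charge estimate for a peptide at physiological pH:
# +1 per Lys/Arg, -1 per Asp/Glu; His and terminal charges are ignored.
def approx_net_charge(seq):
    positive = sum(seq.count(aa) for aa in "KR")
    negative = sum(seq.count(aa) for aa in "DE")
    return positive - negative

# Hypothetical 24-aa, 8-cysteine sequence; NOT the real Hepc-CB1 peptide.
hypothetical_mature = "GCRFCCNCCPNMSGCGVCCRFGKG"
charge = approx_net_charge(hypothetical_mature)
```

For real work one would compute pH-dependent charge from pKa values (e.g. with Biopython's ProtParam) rather than this integer count.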
Abstract:
Elasmobranchs, comprising sharks, skates and rays, have traditionally formed an important fishery along the Indian coast. Since 2000, Indian shark fishermen have been shifting their fishing operations to deeper/oceanic waters by conducting multi-day fishing trips, which has resulted in considerable changes in the species composition of the landings vis-à-vis those reported during the 1980s and 1990s. A case study at Cochin Fisheries Harbour (CFH), southwest coast of India, during 2008-09 indicated that besides the existing gillnet-cum-hooks & line and longline fishery for sharks, a targeted fishery at depths of >300-1000 m for gulper sharks (Centrophorus spp.) has emerged. In 2008, the chondrichthyan landings (excluding batoids) were mainly constituted by offshore and deep-sea species such as Alopias superciliosus (24.2%), Carcharhinus limbatus (21.1%), Echinorhinus brucus (8.2%), Galeocerdo cuvier (5.4%), Centrophorus spp. (7.3%) and Neoharriotta pinnata (4.2%), while the contribution of coastal species such as Sphyrna lewini (14.8%), Carcharhinus sorrah (1.4%) and other Carcharhinus spp. has declined. Several deep-sea sharks previously not recorded in the landings at Cochin were also observed during 2008-09. These include Hexanchus griseus, Deania profundorum, Zameus squamulosus and the pygmy false catshark (undescribed), which have been reported for the first time from Indian waters. Life history characteristics of the major fished species are discussed in relation to the fishery and its possible impacts on the resource.
Abstract:
The present study is the first attempt to understand the population characteristics of the deep-sea pandalid shrimp P. quasigrandis and to assess the status of this resource off the Kerala coast. The total mortality coefficient (Z) of P. quasigrandis was estimated by various methods. The natural mortality coefficient (M) was calculated as 0.65 and 1.02 by Pauly's empirical formula and Srinath's formula, respectively. The catch of P. quasigrandis from the present fishing ground and the monetary return from it have started showing a declining trend. Judging by the current yield and economic return, there is no further scope for increasing the catch from the present fishing ground. The study indicated that the majority of the deep-sea shrimp trawlers, especially those targeting pandalid shrimps, are still concentrated off the Kollam area (Quilon Bank). Even though researchers have located several potential deep-sea fishing grounds in the Indian EEZ through exploratory surveys, fishermen are unaware of these grounds, and sharing information about new potential deep-sea fishing grounds could avert the stock decline that the intensive targeted deep-sea shrimp fishery in the Quilon Bank may cause. Hence, the present study recommends that part of the effort from existing fishing grounds be shifted to the newly located deep-sea fishing grounds, which will help in the sustainable exploitation of deep-sea resources off the Kerala coast.
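Pauly's empirical formula mentioned above relates natural mortality to the von Bertalanffy growth parameters and ambient temperature. The L∞, K and T values below are hypothetical, not the study's estimates for P. quasigrandis.

```python
import math

# Pauly's (1980) empirical formula for natural mortality M (per year):
# log10 M = -0.0066 - 0.279*log10(Linf) + 0.6543*log10(K) + 0.4634*log10(T)
# Linf: asymptotic length (cm), K: growth coefficient (1/yr), T: mean temp (C).
def pauly_m(linf_cm, k_per_yr, temp_c):
    log10_m = (-0.0066 - 0.279 * math.log10(linf_cm)
               + 0.6543 * math.log10(k_per_yr)
               + 0.4634 * math.log10(temp_c))
    return 10 ** log10_m

# Hypothetical growth parameters, for illustration only:
m = pauly_m(linf_cm=20.0, k_per_yr=0.5, temp_c=12.0)
```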
Abstract:
Available information on the abundance of myctophids and their utilisation indicates that there is excellent scope for the development of myctophid fisheries in the Indian Ocean. Most of the conventional fish stocks have reached a state of full exploitation or over-exploitation. Hence there is a need to locate new, non-conventional fishery resources in order to fill the supply-demand gap in the face of increasing demand for fish. Information on the length-weight relationship, age and growth, spawning season, fecundity and age at maturity, as well as on bycatch and discards, is required for the sustainable utilization of the myctophid resource in the Indian Ocean.
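The length-weight relationship mentioned above is conventionally modelled as W = aL^b and fitted by linear regression on log-transformed data. The lengths and weights below are synthetic, generated from an exact cube law, so the fit recovers the known parameters.

```python
import numpy as np

# Length-weight relationship W = a * L^b, estimated by regressing
# log(W) on log(L). The data here are synthetic (exact cube law).
lengths = np.array([4.0, 5.0, 6.0, 7.0, 8.0])   # cm
weights = 0.01 * lengths ** 3.0                  # g

b, log_a = np.polyfit(np.log(lengths), np.log(weights), 1)
a = np.exp(log_a)    # should recover a ~ 0.01, b ~ 3
```

With real samples, b near 3 indicates isometric growth; b significantly above or below 3 indicates allometric growth.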
Abstract:
The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming was a problem for a small niche only: engineers parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures: parallel programming is now a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Different aims were identified in order to reach the objective: research the state of the art of parallel programming today, improve the education of software developers on the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to help avoid them in the future. For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision.
Two different research directions were pursued. The first one resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well-documented and can be used directly in programs, it enables developers to study the source code and learn from it, and it is possible for compiler writers to use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread-cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
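AthenaMP's task-pool components are C++/OpenMP; as a language-neutral sketch of the pattern they implement (workers pulling tasks from a shared pool, where a task may spawn further tasks, as irregular algorithms require), here is a standard-library Python version. All names and the range-splitting workload are illustrative.

```python
import queue
import threading

# Task-pool sketch for irregular algorithms: workers repeatedly take a task;
# processing a task may enqueue further tasks (here: recursive range splitting).
def run_pool(first_task, process, n_threads=4):
    tasks, results, lock = queue.Queue(), [], threading.Lock()
    tasks.put(first_task)

    def worker():
        while True:
            task = tasks.get()
            if task is None:             # sentinel: shut down this worker
                tasks.task_done()
                return
            new_tasks, value = process(task)
            for t in new_tasks:          # enqueue children BEFORE task_done,
                tasks.put(t)             # so the pool never looks finished early
            if value is not None:
                with lock:
                    results.append(value)
            tasks.task_done()

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    tasks.join()                         # waits for all tasks, incl. spawned ones
    for _ in threads:
        tasks.put(None)                  # one sentinel per worker
    for t in threads:
        t.join()
    return results

def split_sum(task):
    lo, hi = task
    if hi - lo <= 100:                   # small enough: compute directly
        return [], sum(range(lo, hi))
    mid = (lo + hi) // 2                 # otherwise split into two subtasks
    return [(lo, mid), (mid, hi)], None

total = sum(run_pool((0, 1000), split_sum))
```

The `Queue.join`/`task_done` accounting is what lets the pool terminate correctly even though the number of tasks is not known up front, which is the essence of the irregular-algorithm case the thesis targets.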
Abstract:
Definition and technical description of Deep Packet Inspection technology, the motives for its use by Internet providers, and a legal assessment of that use under current German criminal law, data protection law and copyright law.
Abstract:
The thesis examines one format of modern architecture: deep multi-storey buildings. These are defined as compact buildings with at least four storeys and side lengths of at least 25 metres in both directions ("deep plans"), without a central core or central atrium. Instead of use, building depth is worked out as the decisive typological parameter. A uniform reproduction scale of 1:1000 for plans and sections allows direct visual comparison across the building references presented. Of the three parts of the thesis, Part I considers references from the period between 1890 and 1990. Part II examines references since 1990. While a chronological arrangement was chosen for the first part, the references of the second part are grouped from a morphological point of view. This change of perspective signals, as elaborated further in Part III, that the more recent references can be interpreted as an unfolding of possibilities already latent in earlier phases of architectural modernism. The thesis thus lies at the intersection of architectural history, building typology and design theory.
Abstract:
This thesis presents a perceptual system for a humanoid robot that integrates abilities such as object localization and recognition with the deeper developmental machinery required to forge those competences out of raw physical experiences. It shows that a robotic platform can build up and maintain a system for object localization, segmentation, and recognition, starting from very little. What the robot starts with is a direct solution to achieving figure/ground separation: it simply 'pokes around' in a region of visual ambiguity and watches what happens. If the arm passes through an area, that area is recognized as free space. If the arm collides with an object, causing it to move, the robot can use that motion to segment the object from the background. Once the robot can acquire reliable segmented views of objects, it learns from them, and from then on recognizes and segments those objects without further contact. Both low-level and high-level visual features can also be learned in this way, and examples are presented for both: orientation detection and affordance recognition, respectively. The motivation for this work is simple. Training on large corpora of annotated real-world data has proven crucial for creating robust solutions to perceptual problems such as speech recognition and face detection. But the powerful tools used during training of such systems are typically stripped away at deployment. Ideally they should remain, particularly for unstable tasks such as object detection, where the set of objects needed in a task tomorrow might be different from the set of objects needed today. The key limiting factor is access to training data, but as this thesis shows, that need not be a problem on a robotic platform that can actively probe its environment, and carry out experiments to resolve ambiguity. 
This work is an instance of a general approach to learning a new perceptual judgment: find special situations in which the perceptual judgment is easy and study these situations to find correlated features that can be observed more generally.
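The motion-based segmentation step described above (comparing the scene before and after the arm makes an object move) can be sketched as simple frame differencing. The synthetic arrays below stand in for camera frames; real systems would add background modelling and noise filtering.

```python
import numpy as np

# Frame-differencing sketch of motion-based figure/ground separation:
# pixels that changed between the pre-poke and post-poke frames are
# treated as belonging to the moved object. Frames here are synthetic.
def segment_by_motion(before, after, threshold=10):
    diff = np.abs(after.astype(int) - before.astype(int))
    return diff > threshold            # boolean object mask

before = np.zeros((8, 8), dtype=np.uint8)   # empty scene
after = before.copy()
after[2:5, 3:6] = 200                       # a 3x3 'object' moved into view
mask = segment_by_motion(before, after)
```

Segmented views harvested this way are exactly the kind of self-labelled training data the thesis argues a robot can gather for itself.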
Abstract:
We contribute a quantitative and systematic model to capture etch non-uniformity in deep reactive ion etch of microelectromechanical systems (MEMS) devices. Deep reactive ion etch is commonly used in MEMS fabrication where high-aspect-ratio features are to be produced in silicon. It is typical for many supposedly identical devices, perhaps of diameter 10 mm, to be etched simultaneously into one silicon wafer of diameter 150 mm. Etch non-uniformity depends on uneven distributions of ion and neutral species at the wafer level, and on local consumption of those species at the device, or die, level. An ion–neutral synergism model is constructed from data obtained from etching several layouts of differing pattern opening densities. Such a model is used to predict wafer-level variation with an r.m.s. error below 3%. This model is combined with a die-level model, which we have reported previously, on a MEMS layout. The two-level model is shown to enable prediction of both within-die and wafer-scale etch rate variation for arbitrary wafer loadings.
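One common way to express ion–neutral synergism is a harmonic (series-resistance) combination of an ion-limited and a neutral-limited rate, with the neutral flux depleted as the pattern opening density grows (the loading effect). The functional form and all constants below are assumptions for illustration, not the fitted model of the paper.

```python
# Illustrative ion-neutral synergism: the etch rate is limited by whichever
# flux is scarcer, combined harmonically; the neutral flux is depleted as
# pattern opening density rises (loading). Constants are assumed, not fitted.
def etch_rate(open_density, k_ion=1.0, k_neutral=5.0, load=8.0):
    ion_term = k_ion                                         # ~uniform ion flux
    neutral_term = k_neutral / (1.0 + load * open_density)   # depleted neutrals
    return 1.0 / (1.0 / ion_term + 1.0 / neutral_term)

# Sparse, medium and dense layouts: rate should fall as density rises.
rates = [etch_rate(d) for d in (0.05, 0.2, 0.5)]
```

A two-level prediction would then multiply such a die-level loading factor by a measured wafer-level flux map, which is the structure of the combined model the abstract describes.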
Abstract:
In 'The student as the focus of the Spanish-as-a-foreign-language class' I present how, in my Spanish 232 classes, the student is the focus and axis of the class. More specifically, I address the following questions: what kind of methodology I use to achieve this objective, what the instructor's expectations of the student are, what the objectives of the class are and how they are achieved, and what tools are used so that the student reaches the level of proficiency required to satisfy the university's language requirement. I also include two or three lessons and a variety of interactive activities as examples. To achieve this balance and to maximise the student's learning experience, the type of methodology is important, as is establishing clear objectives from the first day of class, where the student assumes an active role in the class and where the expectations of the class are explained in detail.
Abstract:
Globalization has led to drastic changes in international trade, prompting many countries, such as France and Colombia, to look for new business partners. This is the reason I investigate how these two countries can be integrated commercially in the dairy sector. Colombia and France have had active trade relations for more than eight years, but these have been declining for several reasons. France has found that Germany is an excellent producer of raw milk, which can supply domestic demand and export large quantities at a low price to European countries because of its proximity. For this reason, Colombia cannot compete with Germany directly in raw milk, but I conclude that Colombia could become a major competitor with organic milk. This conclusion follows from research on how cows are raised and fed in the two countries: in Colombia generally on open pasture, in Germany generally in housing. Colombia also found that Venezuela could offer many benefits in the processing of raw milk and in other processes that require advanced technology, but Colombia has since had several disputes with Venezuela, and many Colombian companies in this sector have been affected. Additionally, France is one of the largest producers and distributors of processed milk and its derivatives; it launches many technologically advanced manufactured products every year. Here Colombia can gain the most, because it can create a strategic alliance with French companies to introduce innovative processed products such as cheese and yoghurt. The theoretical framework of this thesis consists of an analysis of competitiveness, because it is relevant to see whether these two countries are competitive or whether one has a comparative advantage over the other. The related authors are Michael Porter, Adam Smith and David Ricardo.
To complete the theoretical part, we found that France has a comparative advantage over Colombia in this sector thanks to its advanced technology, and that Colombia is not currently competitive in raw milk because its milk price is higher, distance is a barrier, and it lacks the technology. But this research indicates that Colombia could become competitive by selling organic milk.