Ingenuity And Intelligent Risk Assessment For Resilient Geotechnics
Geotechnical engineering is a risky business, and there is much that can, and does, go wrong. It is often said that the single most common cause of failure in construction (including delays and additional costs) lies in the ground. This would suggest that the logical path to more resilient infrastructure is the adoption of over-conservative designs. However, the in-ground structures that collapse often have a number of fundamental and basic design flaws, and in reality most in-ground structures move considerably less than predicted at the design stage, suggesting that they were, in fact, over-designed. Over-design can itself be considered a form of failure, as it adds cost and delay during construction. It is generally accepted that a resilient piece of infrastructure is not necessarily one that does not fail in a catastrophic event, i.e. one that is over-designed to withstand such an event; otherwise the concepts of sustainability and resilience would be in conflict. A resilient design is one that does not cause significant disruption to the community and can return to effective function as quickly as possible after the catastrophic event.
So, how can geotechnical engineers achieve resilient infrastructure designs? The best approach appears to be intelligent risk assessment based on an in-depth understanding of how a design will perform before, during and after a catastrophic event. This approach requires sound knowledge of the fundamental principles of geotechnical engineering, such as solid mechanics, geology and failure mechanisms. This paper discusses some of the requirements for intelligent risk assessment and presents a practical example of an approach that could be adopted for the design of resilient infrastructure. Its primary focus is on the anticipated performance during a potential failure, with the intelligent risk assessment forming the basis of the entire design.
-
Sampling Disturbance in Soft Ground: Implications in Geotechnical Design
This paper discusses the main mechanisms of sampling disturbance in soft ground and their implications for geotechnical design. In addition to the mechanical disturbance associated with the type of sampler, the influence of factors frequently overlooked in practice, such as thermal loading due to waxing and biological effects due to long-term storage, is evaluated. The paper discusses three methodologies for assessing sample quality in soft soils, which may be easily incorporated into practice. In the last section, mechanical soil properties derived from specimens retrieved using four different samplers are used to predict the total settlement and excess pore water pressure underneath an embankment. The results of the prediction exercise demonstrate the negative effects of poor sampling on geotechnical design, and show that reliable and cost-effective predictions of the performance of geotechnical infrastructure are possible with minimal improvements to current practice for the sampling and testing of soft clays.
-
Compatibility Of Some Compacted Victorian Soils With Organic Chemicals And Waste Leachates
Compacted soil liners are used as hydraulic barriers in waste containment facilities to reduce the rate of pollutant migration from the waste into groundwater. It is imperative for these liners to have low hydraulic conductivity, preferably less than 10⁻⁹ m/s, over the design life of these facilities. During the assessment of a particular soil for liner construction, laboratory hydraulic conductivity tests with water are undertaken to examine whether the field-compacted soil can achieve hydraulic conductivities below this value. However, in waste containment facilities compacted soil liners come into contact with various chemical leachates, and increases in hydraulic conductivity may result when compacted soils are permeated with some chemical liquids commonly found in leachates. It follows that tests should be undertaken to assess the compatibility of these chemicals with the selected soil. It is generally considered that hydraulic conductivity tests with chemical leachate provide direct evidence of soil compatibility, and such tests are commonly undertaken as part of the design process for the facility.
The current paper presents the results of a study assessing the compatibility of two Victorian soils using laboratory hydraulic conductivity tests. Hydraulic conductivity was measured over time while permeating with water, leachate, methanol and modified leachate, using flexible wall permeameters (FWPs), consolidation cell permeameters (CCPs) and compaction mould permeameters (CMPs). The study indicates that the compatibility results are heavily dependent on the test method: the commonly adopted flexible wall method does not usually produce a significant change in hydraulic conductivity, whereas CCPs and CMPs produce considerable increases. The necessity for a better testing method is highlighted on the basis of these results. A new testing technique with a zero-lateral-strain boundary condition was adopted, and its results are presented for comparison.
-
Confined and partially confined swelling pressure of basaltic clays
This paper looks at the swell pressures developed in basaltic clay under confined and partially confined conditions, including the reduction in pressure with expansion of the soil. It compares the measured values with those presented in the literature for similar materials and examines the suitability of empirical equations from the literature for estimating confined swell pressure in Melbourne’s basaltic clays. The paper also considers potential applications of the findings in design, and relevant considerations for design in accordance with Australian Standards.
-
Data Mining For Geotechnical And Mining Engineers
It is now quite customary to arrange any set of data in a computer spreadsheet in which the rows represent different cases or tests and the columns represent the values of the parameters measured or calculated for each case. However, in some cases we find it very hard, or even impossible, to find patterns or models in a set of data. Usually the degree of difficulty increases with both the number of parameters (columns) and the number of data points (rows). One is also quite likely to face data that are somehow biased, deficient or inaccurate. This is usually the most disturbing difficulty, as any model incorporating such deficiencies or inaccuracies contains noise. Such models cannot be useful until they pass through an appropriate filtering process; but most often even the judgmental recognition of the bad data is itself unknown and needs to be discovered, investigated and justified.
Successful models are the result of good, reliable and accurate data. For example, suppose we have not yet discovered the well-known electrical relations V = IR and P = RI², but, after some experiments, we have measured the currents I = 10, 20, 30 A, the resistances R = 5, 10, 15 Ω, the voltages V = 50, 200, 450 V, and the powers P = 0.5, 4 and 13.5 kW. From this “good” data one can easily derive the relationship between the power (P), the resistance (R) and the current (I), arriving at the well-established equation P = RI², even though the three parameters (I, R, V) are not independent and are potential sources of noise if the data were inaccurate, biased or disturbed. It is left to the reader to disturb this data in any fashion and experience the extreme difficulty of rediscovering these relationships, which are no longer even unknown to us.
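The worked example above can be checked numerically. The short sketch below (illustrative only, not part of the original paper) verifies both relations against the quoted measurements, and also demonstrates the non-independence the text warns about: in this particular data set R happens to equal I/2, so the collinear model P = I³/2 fits the power data exactly as well as P = RI².

```python
# The "good" data from the example (SI units: A, ohm, V, W).
I = [10, 20, 30]        # currents
R = [5, 10, 15]         # resistances
V = [50, 200, 450]      # voltages
P = [500, 4000, 13500]  # powers (0.5, 4 and 13.5 kW)

# Verify Ohm's law V = I*R and the power law P = R*I^2 point by point.
for i, r, v, p in zip(I, R, V, P):
    assert v == i * r
    assert p == r * i ** 2

# The parameters are not independent: here R = I/2, so the collinear
# model P = I^3 / 2 reproduces the power data just as well.
assert [i ** 3 / 2 for i in I] == P
```

With only three perfectly correlated measurements, the data cannot distinguish P = RI² from P = I³/2; this is exactly why biased or correlated data make model rediscovery so difficult.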
The problem is that most often we do not know how good or bad, efficient or deficient, our data are. These are perhaps sufficient reasons to remind us that any experimental data need to be validated professionally before they can be used in data mining for predictive modelling.
Data mining is the process of extracting a category, pattern or model from existing data in order to predict other data, whether existing or not yet observed. Linear regression, i.e. fitting a simple line to a set of data, is the simplest data mining method; in this case there are no interactions between the independent parameters. Any curvature in the fitted line is a sign of either non-linearity or interactions among the parameters. Traditional statistical regression models are not appropriate for discrete, descriptive or item-based data. For example, consider data points (e.g. the percentage of the time one shopping item (dairy) is sold together with another item (meat) in a supermarket) distributed as two distinct regions or clusters in an x-y coordinate system. A classical mathematical regression function cannot represent such discontinuous sets or clustering behaviour. For discontinuous or item data, cluster analysis and decision trees are the two common techniques used to form subsets, groups or categories with common behaviour or properties.
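The simplest method mentioned above, ordinary least-squares fitting of a line, can be sketched in a few lines of pure Python. This is a minimal illustration on made-up data, not code from the paper:

```python
# Ordinary least squares for a line y = a + b*x (illustrative sketch).
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.0, 8.1, 9.9]   # roughly y = 2x
a, b = fit_line(xs, ys)           # intercept near 0, slope near 2
```

If the same procedure were applied to data falling in two separate clusters, the single fitted line would pass between the clusters and represent neither, which is the limitation of classical regression that the text points out.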
There are two categories of data mining techniques, namely the inferential and the non-inferential. Hypothesis testing and inference from sample to population are the main features of inferential data mining, and inferential techniques have their foundations in traditional statistical theory. Discriminant analysis (group regression), linear regression, analysis of variance, logistic and multinomial (categorical) regression, and time series analysis all belong to this category. The key difference between inferential and non-inferential techniques lies in whether hypotheses need to be specified beforehand: in non-inferential methods this is not normally required, as each is semi- or fully automated as it searches for a model. Cluster analysis, market basket (association) analysis, link analysis, memory-based reasoning, decision trees and neural networks all belong to the non-inferential techniques. There is no predefined outcome in cluster analysis, decision trees or market basket analysis, as all three use categorical or continuous predictors to create cluster, tree or basket (association) memberships for the data points. Linkage is created between sets of items in link analysis, while all types of data (including text) can be entered into the memory-based reasoning technique to predict an outcome. Amongst all of these, the neural network is one of the most popular and powerful techniques, able to use both categorical and continuous predictors to predict categorical and continuous outcome variables. The reader may refer to Berry et al. (1997) for more information on these models.
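The "no predefined outcome" character of the non-inferential techniques can be illustrated with a toy cluster analysis. The sketch below (an illustration under assumed data, not from the paper) runs a minimal one-dimensional k-means: the algorithm is told only how many groups to form, not what the groups mean, and it discovers the cluster memberships itself.

```python
# Toy 1-D k-means cluster analysis (illustrative sketch).
def kmeans_1d(points, k=2, iters=20):
    # Crude initialisation: spread starting centres across the sorted data.
    pts = sorted(points)
    centers = [pts[i * (len(pts) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centre.
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: abs(p - centers[j]))
            groups[nearest].append(p)
        # Move each centre to the mean of its group.
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

data = [0.1, 0.2, 0.15, 0.9, 1.0, 0.95]   # two obvious clusters
centers, groups = kmeans_1d(data, k=2)     # centres near 0.15 and 0.95
```

No hypothesis is specified beforehand; the two cluster memberships emerge from the search, which is exactly the distinction drawn above between non-inferential and inferential techniques.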
Inferential methods and neural networks are perhaps the most relevant techniques to almost all engineering disciplines. Skipping the inferential methods (Alehossein and Hood, 1996), we elaborate on neural networks.
-
A prototype of a self-assembly slope-stability scanner for small slopes
Landslides are a frequent hazard, causing thousands of fatalities annually worldwide. However, such incidents are predictable given appropriate monitoring equipment. In general, slopes, and especially rock slopes, usually exhibit a few minor movements before they completely slide; these signals occur from a few seconds to a few weeks before any noticeable failure. There are currently plenty of slope monitoring devices, but they are expensive, complex and specialised for large-scale mass wasting, and there is a need for affordable devices for both domestic and public use. This paper summarises the challenges involved in the design of a low-cost, self-assembly slope-stability monitoring device. The device uses Light Detection and Ranging (LiDAR) technology, which sends a light beam to a slope and captures the reflection; based on the travel time, the distance from the device to a point on the slope is calculated. Continuous measurement detects any imminent slope movement. The device is accurate, durable and can operate under different working conditions with a high degree of reliability.
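The travel-time-to-distance conversion described in the abstract is a standard time-of-flight calculation, d = c·t/2, where t is the round-trip travel time of the pulse and c is the speed of light. The sketch below is a generic illustration of that relation, not the device's actual firmware:

```python
# Time-of-flight ranging: the pulse travels to the slope and back,
# so the one-way distance is half of (speed of light x travel time).
SPEED_OF_LIGHT = 299_792_458.0  # m/s, in vacuum

def tof_distance_m(round_trip_time_s):
    """One-way distance (m) to the target from a round-trip time (s)."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A pulse returning after 200 ns corresponds to a point about 30 m away.
d = tof_distance_m(200e-9)
```

Repeating this measurement at the same aim point and comparing successive distances is what lets the device flag the small pre-failure movements the abstract describes.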
-
Problems in testing of carbonate sediments
Carbonate sediments are formed in marine environments in the tropical and sub-tropical climate belts around the world, such as off southern Africa, India, Indonesia, Brazil and Australia. These sediments are characterised by their high crushability potential and by variability in composition, grain shape, fabric and mineralogy. The design of foundations for offshore structures to be installed in these areas requires engineering parameters that are generally determined using offshore site investigation data combined with onshore laboratory tests. The reliability and robustness of the design criteria are heavily reliant on the accuracy of the field and laboratory data. The field testing methodologies are generally well understood and can be verified using the recovered samples; however, conventional laboratory testing procedures are generally inadequate for carbonate sediments. Standard testing procedures may yield a wide range of engineering parameters, which could lead to costly structural designs and, in some cases, may jeopardise the development of the field.
An audit was carried out by Advanced Geomechanics (AG) as part of its QA process, in two parts. The first part consisted of testing seven material types, three non-carbonate (Silica Sand, Silica Flour and Kaolin Clay) and four carbonate materials (one terrestrial and three offshore), at four different laboratories; the identity of the samples was withheld from the laboratories (blind tests). The tests requested included classification, permeability and consolidation tests. The second part of the audit investigates the effect of the operator on the test results; this work is currently being carried out at AG’s laboratory (agLAB) and will be reported in a separate paper.