Optimizing mining projects: how business intelligence enhances data-driven decisions
Authors: João Victor Valle Mazzaro (IFMG - Campus Congonhas), Natalia Fernanda Santos Pereira (IFMG - Campus Congonhas), Sinval Pedroso da Silva (IFMG - Campus Congonhas)
https://lnkd.in/dmUf7gqt DOI: https://lnkd.in/dF6mhjbq
Keywords: Power BI, Data processing speed, Project management processes, Software application, Real-time analytics
Abstract: The project management sector grapples with the challenge of handling vast amounts of information, and data must be compiled to support effective decision-making by project managers. This paper emphasizes the significance of both project planning and execution monitoring, while exploring the applicability of Business Intelligence (BI) software and its contributions to enhancing project management. The methodology combined a literature review with the authors' expertise in engineering projects. The findings reveal that BI offers substantial benefits, including reducing the time required to transform large datasets into clear, concise graphical information that can be readily presented and dynamically adjusted to specific user needs. A study conducted at a mining company demonstrated a significant reduction in the time spent on project management activities: using Power BI, the team cut the total time dedicated to project management activities by 50% and the time spent issuing and updating reports by 83%. This outcome ensures that project team members work from consolidated, aligned information, facilitating real-time decision-making and fostering greater integration and productivity.
BJPE Brazilian Journal of Prod Engineering’s Post
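As a rough illustration of the consolidation step the paper attributes to BI tools, the sketch below aggregates hypothetical project-task records into the kind of per-project summary a Power BI dashboard would present. The data, column names, and metrics are invented for illustration and are not taken from the study.

```python
# Illustrative sketch only: hypothetical project-tracking data, not the dataset
# used in the study. It mimics the consolidation step that BI tools automate,
# turning raw task records into a per-project status summary for a dashboard.
import pandas as pd

tasks = pd.DataFrame({
    "project":   ["Crusher upgrade", "Crusher upgrade", "Tailings dam", "Tailings dam"],
    "status":    ["done", "in progress", "done", "late"],
    "planned_h": [120, 80, 200, 60],
    "actual_h":  [110, 40, 230, 75],
})

summary = (
    tasks.groupby("project")
         .agg(tasks_total=("status", "size"),
              tasks_done=("status", lambda s: (s == "done").sum()),
              planned_h=("planned_h", "sum"),
              actual_h=("actual_h", "sum"))
         .assign(progress_pct=lambda d: 100 * d.tasks_done / d.tasks_total,
                 hours_variance=lambda d: d.actual_h - d.planned_h)
)
print(summary)
```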
More Relevant Posts
Mining Optimization with Advanced Software: ⛏️💻 Optimizing day-to-day operations in the fast-paced mining world is crucial for maximizing efficiency and profitability. Using advanced mining software like GEOVIA Surpac, MineSched, and Whittle can significantly enhance the decision-making process. 📊💻 🔍 Geological Data Integration is the first step in optimization. Daily, geologists collect and input data into Surpac, continuously updating 3D geological models. This ensures that the most accurate resource estimates are used in planning. 🛠️📊 📅 Weekly Pit Optimization helps adapt to changing market conditions by recalculating the most economically viable sections to mine with software like Whittle. This dynamic approach ensures long-term profitability. 💰⛏️ 📈 Detailed Short-Term Schedules are critical in daily operations and can be provided by MineSched software. This software helps allocate equipment and personnel efficiently, maximizing output and minimizing downtime. ⚙️👷♂️👷♀️ 🔄 Real-Time Production Monitoring ensures that operations stay on track. Any deviations from the plan are addressed, thanks to the feedback loop established by integrating real-time data into these platforms. 📡📈 🔧 Monthly Performance Reviews, supported by what-if scenarios in Whittle and MineSched, drive continuous improvement. This approach to mine optimization ensures that operations remain agile and responsive to both internal and external factors. 🔍⚒️ By applying these tools, mining operations can achieve higher efficiency, cost control, and operational excellence. 🌟🏆
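For a sense of why re-running the pit optimizer changes what gets mined, here is a minimal block economic value screen with entirely hypothetical block data, prices, and costs. It is not Whittle's algorithm (commercial pit optimizers use far more sophisticated Lerchs-Grossmann-style methods with slope constraints); it only shows how the same blocks flip between "economic" and "waste" as the price assumption moves.

```python
# Hypothetical block economic value (BEV) screen. Tonnes, grades, prices and
# costs are invented; real pit optimisation adds slope constraints, discounting
# and scheduling. The point: a price update changes which blocks are viable.
def block_value(tonnes, grade, price_per_unit, recovery,
                mining_cost_per_t, processing_cost_per_t):
    revenue = tonnes * grade * recovery * price_per_unit
    cost = tonnes * (mining_cost_per_t + processing_cost_per_t)
    return revenue - cost

blocks = [            # (tonnes, grade in payable-metal units per tonne) -- hypothetical
    (10_000, 0.012),
    (10_000, 0.006),
    (10_000, 0.003),
]

for price in (1_800, 2_400):    # e.g. two successive weekly price assumptions
    viable = [b for b in blocks
              if block_value(*b, price_per_unit=price, recovery=0.90,
                             mining_cost_per_t=4.0, processing_cost_per_t=18.0) > 0]
    print(f"price {price}: {len(viable)} of {len(blocks)} blocks are economic")
```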
💥 Organizational Process Mining research is advancing 💥 I am happy to announce that we recently published two papers in the BISE Journal that advance the organizational / managerial perspective on process mining. Both papers are available open access and are looking forward to being read... 😊
1️⃣ The first paper explores how organizations can navigate the design space of organizational process mining setups such as Centers of Excellence (CoEs). To that end, we propose a taxonomy of design dimensions along with many exemplary instantiations. The paper is a further developed version of a large-scale exploratory study we conducted together with Celonis. Thank you, Laura Marcus, Sebastian Schmid, Franziska Friedrich, Philipp Grindemann for driving this initiative. The paper can be found here: https://lnkd.in/e2rK8vsF
2️⃣ The second paper presents a method for identifying, prioritizing, and monitoring portfolios of process mining use cases in larger organizations, which is essential for scaling process mining and realizing the related value. The paper benefits from a real-world case study conducted at Infineon Technologies. Thank you, Dominik Fischer and Laura Marcus for pushing this forward. The paper can be found here: https://lnkd.in/eGquqTNp
Of course, there is more to come from FIM Forschungsinstitut für Informationsmanagement and Fraunhofer-Institut für Angewandte Informationstechnik FIT, including research on fuck-up stories for process mining, CoE journeys over time, and the convergence of process mining CoEs with BPM, Analytics, and Automation CoEs... Where do you think organizational process mining needs more applied research? Happy to get your input. Universität Bayreuth Fraunhofer Center for Process Intelligence
"Process Mining". Been there and done that, and I instantly liked this concise term for my multi-pronged approach to operations improvement. However, like all buzzy "new to us" business terminology, we need a common definition. The definition in the link below is a good start, but I find it a little narrow based on my personal experiences. If you would prefer to leave the definition in the link intact, let's just say that I would pair their approach with a couple of other activities, They introduce process mining as analysis based on passive data collection. (Transaction logs and the process "digital footprint"). This massive data autopsy approach, they say, gives a better and sometimes surprising view of actual process than the vision created in process workshops. While I have gleaned a lot of valuable process information with a read-only look inside the SQL back tier of enterprise software systems, I have found that you must also take a couple of before and after "Walkabouts" to where the front-line work is actually done, and even actively collect some exploratory/validation data "offline" from the corporate systems. A data dive needs to be paired with a reality check. This helps to build a complete picture of a process "as is" and then later, "as reinvented". A call center I know "improved" their wait time statistics reported from their computer logs literally overnight. That "improvement" lasted until someone noticed an incoming data trunk line had been disconnected. Their system was unaware of all the "busy/can't connect" calls that were lost. It only recorded that call volume was down and calls were answered more rapidly. -Don't- just swim in the data lake. Be sure to ask, what is happening in the real world? Where do the numbers come from? -Do- audit the chain of logic from real-world human observation, to automation & measurement, and finally to summation in a corporate database. Find a popular definition of "process mining" here: https://lnkd.in/eicXU6U2
Scoping Study: Essential Assessment for Mining Project Feasibility
A Scoping Study is an early-stage assessment that evaluates the technical and economic feasibility of a mining project. Typically conducted during the reconnaissance or prospecting phase, it helps determine if further exploration and investment are warranted or if the project should be abandoned. It relies on preliminary data and assumptions, with an accuracy range of 40%-50%.
Key Technical Parameters:
1. Mineral Resources & Reserves: Initial resource estimates based on geological surveys, surface sampling, drilling, and geophysical data, typically classified as inferred or indicated. Preliminary ore grade and tonnage estimates based on geological models and analogs from similar deposits.
2. Mining Method: Selection of the most feasible mining method (open-pit or underground) based on deposit geometry and depth. Estimation of stripping ratio for ore-to-waste assessment.
3. Metallurgical Testing & Recovery: Assumed recovery rates from preliminary testwork or industry standards (e.g., flotation, heap leaching). High-level processing flow assumptions, including plant configuration and throughput.
4. CAPEX/OPEX: Preliminary capital expenditure (CAPEX) estimates for mine development, infrastructure, and plant construction. Operational expenditure (OPEX) estimates for mining, processing, waste management, and logistics.
5. Infrastructure & Logistics: Assessment of required infrastructure such as roads, water, and power. Evaluation of ore transport options and the need for new infrastructure.
6. Environmental & Social Considerations (ESG): Preliminary identification of environmental impacts and mitigation strategies. Early evaluation of permitting and regulatory requirements, as well as community engagement.
7. Economic & Financial Modeling: Early-stage financial modeling based on commodity prices, ore grades, and recovery rates. Sensitivity analysis of cash flows to commodity price and cost fluctuations (see the sketch after this list).
8. Risk Assessment: Technical risks related to ore body variability, mining assumptions, and recovery rates. Financial risks from economic factors such as price volatility and cost escalations.
Outcome: The Scoping Study produces a conceptual project model, guiding the go/no-go decision on whether the project should proceed to more detailed feasibility studies or be abandoned.
Purpose: It provides a high-level feasibility assessment to determine if further detailed exploration or studies are justified.
Conclusion: A Scoping Study provides an essential preliminary evaluation of a mining project's technical and economic feasibility, offering initial estimates with a 40%-50% accuracy range. It informs decisions on whether to advance to more detailed studies or halt further investment, based on key parameters such as resource estimates, mining methods, CAPEX/OPEX, and environmental considerations.
#Geology #ScopingStudy #FeasibilityStudy #MineralResources #MiningMethods
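As promised under item 7, here is a minimal sketch of a scoping-level price sensitivity check. Every input (grade, recovery, costs, CAPEX, price, discount rate) is hypothetical and chosen only to show how NPV responds to a ±20% commodity-price swing; real scoping models are considerably richer.

```python
# Scoping-level sketch with entirely hypothetical inputs, showing how NPV
# sensitivity to the commodity price assumption is typically probed.
def npv(cashflows, rate):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def project_cashflows(price_per_t_metal, tonnes_ore_per_year=1_000_000, grade=0.015,
                      recovery=0.85, opex_per_t_ore=30.0, capex=250_000_000, years=8):
    # annual cash flow = payable metal revenue minus operating cost, per year of production
    annual = tonnes_ore_per_year * (grade * recovery * price_per_t_metal - opex_per_t_ore)
    return [-capex] + [annual] * years

base_price = 9_000   # $/t of payable metal (hypothetical)
for shift in (-0.20, 0.0, 0.20):            # +/-20% commodity price sensitivity
    price = base_price * (1 + shift)
    value = npv(project_cashflows(price), rate=0.08)
    print(f"price {price:>7,.0f} $/t: NPV = {value / 1e6:7.1f} M")
```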
Revolutionizing Mining Engineering Education: Integrating Advanced Financial Management for Future Industry Leaders In today's rapidly evolving mining industry, the traditional approach to engineering education requires a significant transformation. While technical expertise in mineral extraction, processing, and operational management remains fundamental, there's an increasingly critical need to incorporate sophisticated financial management principles into mining engineering curricula. This integration is not merely an academic exercise—it's a response to the industry's complex reality where billion-dollar investments intersect with volatile commodity markets and evolving geopolitical landscapes. The mining sector stands apart from many other industries due to its unique combination of capital intensity, extended development timelines, and exposure to multiple risk factors. Modern mining projects regularly demand investments exceeding several billion dollars, with development phases that can stretch beyond a decade before the first ounce of mineral is produced. This financial magnitude, coupled with the inherent uncertainties in mineral exploration and extraction, creates an environment where sophisticated financial understanding becomes as crucial as technical expertise. Real options analysis emerges as a particularly vital tool in the mining engineer's financial toolkit. Unlike traditional net present value calculations, real options thinking enables engineers to quantify and value the flexibility inherent in mining operations. This approach becomes invaluable when considering operational decisions such as temporary mine closures during price downturns, phased expansion strategies, or the optimal timing of project development. Mining engineers equipped with real options analysis skills can better articulate the value of maintaining operational flexibility to stakeholders and make more informed decisions about project timing and scale. Financial option pricing theory, while traditionally associated with securities markets, has found powerful applications in mining project evaluation. Understanding option pricing mechanisms helps engineers develop more sophisticated approaches to valuing mineral rights, structuring project financing, and implementing effective hedging strategies. This knowledge becomes particularly crucial when negotiating joint ventures, evaluating exploration licenses, or making decisions about project development timing. Engineers who understand these principles can better protect their projects against adverse market movements while maintaining upside potential. The integration of advanced risk management techniques represents another critical area for mining engineering education...
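To make the real-options argument concrete, below is a minimal sketch with hypothetical numbers: a one-factor binomial lattice valuing the option to defer investment in a project. It only illustrates the flexibility premium the article refers to; it is not a full mining real-options model, which would add commodity price processes, operating costs, and switching between operating modes.

```python
# Minimal binomial sketch (hypothetical numbers) valuing the option to defer a
# project: at each node we take the better of investing now or waiting.
import math

def defer_option_value(v0, capex, sigma, r, years, steps):
    dt = years / steps
    u = math.exp(sigma * math.sqrt(dt))          # up move in project value
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)         # risk-neutral up probability
    disc = math.exp(-r * dt)
    # value at the final decision date: invest if worthwhile, otherwise walk away
    values = [max(v0 * u**j * d**(steps - j) - capex, 0.0) for j in range(steps + 1)]
    for step in range(steps - 1, -1, -1):        # roll back through the lattice
        values = [
            max(v0 * u**j * d**(step - j) - capex,                   # invest now
                disc * (p * values[j + 1] + (1 - p) * values[j]))    # keep the option alive
            for j in range(step + 1)
        ]
    return values[0]

static_npv = 500.0 - 450.0                       # invest-now NPV, $M (hypothetical)
flexible_npv = defer_option_value(v0=500.0, capex=450.0, sigma=0.35,
                                  r=0.05, years=5, steps=60)
print(f"invest-now NPV: {static_npv:.0f} M | NPV including option to defer: {flexible_npv:.0f} M")
```

Under these invented inputs the right to wait is worth far more than the invest-now NPV, which is exactly the kind of flexibility value the article argues engineers should be able to quantify and explain.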
Comprehensive Overview of the Mining Business Case Project Matrix The successful development and execution of a mining project require a rigorous evaluation across multiple key parameters. The Mining Business Case Project Matrix serves as a strategic framework for assessing all critical aspects—from initial exploration through to full-scale production. By systematically addressing these factors, industry professionals can mitigate risks, optimize resource allocation, and ensure the sustainability and financial viability of mining operations. 1️⃣ Project Stage: Define the project's lifecycle phase—exploration, development, or production. 2️⃣ Location: Assess the geographic setting and logistical access. 3️⃣ History: Review historical data on past exploration and mining activities. 4️⃣ Corporate Structure: Examine the organization behind the project, including ownership and governance. 5️⃣ Development Team: Evaluate the expertise of professionals managing the project. 6️⃣ Permits and Licenses: Ensure all regulatory approvals are in place for compliance. 7️⃣ Geology: Analyze the deposit type, formation, and mineralogical properties. 8️⃣ Exploration: Investigate techniques and results of drilling, sampling, and geophysical surveys. 9️⃣ Resources and Reserves: Quantify the deposit as per JORC, NI 43-101, or similar standards. 🔟 Mining: Outline the extraction methods, mining technology, and recovery rates. 1️⃣1️⃣ Mine Equipment: Specify machinery and equipment for efficient operations. 1️⃣2️⃣ Processing: Detail beneficiation techniques and processing plant design. 1️⃣3️⃣ Production Parameters: Highlight planned throughput, grade, and recovery rates. 1️⃣4️⃣ Infrastructure: Identify key infrastructure—power, water, and transport. 1️⃣5️⃣ Products: Define the final products and by-products. 1️⃣6️⃣ Markets: Assess market demand, pricing trends, and customer base. 1️⃣7️⃣ Logistics: Detail transportation routes, costs, and supply chain management. 1️⃣8️⃣ Human Resources: Evaluate workforce requirements and skill availability. 1️⃣9️⃣ Environmental: Assess impacts and mitigation strategies for sustainability. 2️⃣0️⃣ Society and Community: Consider local community engagement and benefits. 2️⃣1️⃣ Financial—CAPEX: Analyze initial capital expenditures. 2️⃣2️⃣ Financial—OPEX: Review ongoing operational costs. 2️⃣3️⃣ Financial Indicators: Measure NPV, IRR, and payback period for investment returns. 2️⃣4️⃣ Project Plan: Outline a clear roadmap for project execution. 2️⃣5️⃣ Valuation: Determine the project’s overall financial worth. 2️⃣6️⃣ Funding Requirements: Highlight capital needs and potential sources. 2️⃣7️⃣ Legal Agreements/Matters: Ensure robust legal frameworks are in place. 2️⃣8️⃣ Documents Available: Maintain a repository of project reports and data. Each element is integral to de-risking projects and ensuring long-term sustainability. #Geology #Mining #ResourceEstimation #MinePlanning #Exploration #ProjectManagement
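As an illustration of item 2️⃣3️⃣ (Financial Indicators), here is a small sketch computing NPV, IRR, and simple payback from a single hypothetical cash-flow series. The figures are invented and not tied to any real project.

```python
# Hypothetical cash-flow sketch for item 23 (Financial Indicators): NPV, IRR
# and simple payback from one annual cash-flow series.
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.9, hi=2.0, tol=1e-7):
    # bisection on NPV(rate) = 0; assumes a single sign change in the cash flows
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_years(cashflows):
    cumulative = 0.0
    for year, cf in enumerate(cashflows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None

flows = [-300, 60, 75, 90, 90, 90, 80, 70]   # $M, year 0 = construction CAPEX
print(f"NPV @ 8%: {npv(0.08, flows):.1f} M")
print(f"IRR:      {irr(flows):.1%}")
print(f"Payback:  year {payback_years(flows)}")
```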
How to Enable Activity Effort Times in Process Mining
We are often asked to determine the time various activities took. Unfortunately, many times there are no timestamps to represent when an activity started and ended. Represented as a single point in time, activities such as "Approve Invoice" lose a lot of meaning. There are many ways to address the lack of time.
Our approach is to first create a list of all activities in the system in an OLAP table. We then use the following PQL to order them by their average location in the process: AVG(INDEX_ORDER_ACTIVITY(ACTIVITY_COLUMN())). This is helpful because it can be difficult to see this metric visually in the Process or Variant Explorers. Additionally, those components do not show all the data at once, whereas an OLAP table built this way accurately shows the average for the whole dataset.
Next, we like to create three sets of variables, numbered 1-3. These represent checkpoints in the process that are more important than a routine change on the order. For example, we have used this in order management with various checkpoint sets built from activities such as "Order Created", "Order Filled", "Order Shipped", and "Order Invoiced". Variables to create for each set of checkpoints:
- Activity Set: comma-separated list of all the different activities for this checkpoint. Usually filled from a button dropdown component.
- FTE Cost: float value for the expected cost it takes to execute the checkpoint.
- Expected SLA: float value for the amount of time needed to complete the checkpoint from the prior checkpoint.
- Missed Penalty: float value for the penalty in dollars incurred if the checkpoint was not met within the specified amount of time.
Finally, we use the checkpoints to construct an analysis of all the different timing breakdowns. Examples include "Process Start" to Checkpoint 1, Checkpoint 1 to Checkpoint 3, and the amount of cost incurred by missing SLAs on Checkpoint 3.
We've used this format successfully on several projects to accurately account for the time it takes to do certain activities. Additionally, you can use the other activities left in the data to represent "updates" and do root-cause analysis on which updates caused checkpoints to be missed. This type of view is missing from vanilla process mining tools and can be especially helpful once it is built in. As you can see, there are many ways to address the lack of time. Hope this example from ProcessMiningIQ is helpful in your analysis journey! Want these in your inbox? Let us know in the comments and we'll add you to the list.
#processmining #Celonis
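The approach above is Celonis/PQL-specific. As a tool-neutral illustration of the same checkpoint-timing idea, here is a rough pandas sketch on a hypothetical event log; the column names, activities, and SLA values are assumptions for illustration, not the ProcessMiningIQ implementation.

```python
# Hypothetical pandas analogue of the checkpoint-timing idea (not the PQL/Celonis
# implementation): for each case, take the first timestamp of each checkpoint
# activity and measure the gaps between consecutive checkpoints against an SLA.
import pandas as pd

log = pd.DataFrame({
    "case_id":   ["A", "A", "A", "B", "B", "B"],
    "activity":  ["Order Created", "Order Filled", "Order Shipped",
                  "Order Created", "Order Filled", "Order Shipped"],
    "timestamp": pd.to_datetime([
        "2024-05-01 09:00", "2024-05-02 14:00", "2024-05-04 08:00",
        "2024-05-01 10:00", "2024-05-03 16:00", "2024-05-07 09:00"]),
})

checkpoints = ["Order Created", "Order Filled", "Order Shipped"]   # ordered checkpoint sets
expected_sla_h = {"Order Filled": 36.0, "Order Shipped": 48.0}     # hours from prior checkpoint

# first occurrence of each checkpoint per case, one column per checkpoint
first_hit = (log[log.activity.isin(checkpoints)]
             .groupby(["case_id", "activity"])["timestamp"].min()
             .unstack()[checkpoints])

for prev, curr in zip(checkpoints, checkpoints[1:]):
    hours = (first_hit[curr] - first_hit[prev]).dt.total_seconds() / 3600
    missed = hours > expected_sla_h[curr]
    print(f"{prev} -> {curr}: avg {hours.mean():.1f} h, SLA missed in {missed.sum()} case(s)")
```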
Mine-to-Mill Optimization: A Comprehensive Approach to Enhancing Mining Efficiency
The Mine-to-Mill (M2M) methodology represents a strategic framework for developing integrated operational and control strategies that enhance the entire mining process, from extraction to processing. By maximizing throughput, minimizing costs per tonne, and ultimately boosting profitability, this methodology emphasizes a holistic approach rather than focusing on individual processes.
Key components of the Mine-to-Mill methodology include:
1. Benchmarking: Assessing the performance of both mining and processing plants to identify areas for improvement.
2. Ore Characterization: Analysing ore properties to inform key decisions in drill and blast fragmentation and in the design of crushing and grinding circuits.
3. Measurements and Modelling: Utilizing data to create accurate simulations that reflect the behaviour of various processes.
4. Implementation: Putting the developed strategies into action to achieve optimal results.
5. Monitoring: Continuously tracking performance to refine and adapt strategies as needed.
Over the past two decades, numerous consultants and research institutions have successfully implemented the Mine-to-Mill methodology, which aims to maximize the overall profitability of mining operations. These studies have demonstrated the interdependency of all processes within the Mine-to-Mill value chain. Specifically, the efficiency of downstream milling processes—such as crushing and grinding—can be significantly influenced by the characteristics of the upstream mining operations. Empirical results from various Mine-to-Mill projects have shown impressive increases in mill throughput, ranging from 5% to 30%, depending largely on the strength and comminution properties of the ore.
It's crucial to note that while the size of blasted material plays a critical role in the efficiency of crushing and grinding, the particle size achieved through grinding directly affects the recovery rates in subsequent separation processes. Therefore, understanding the trade-offs between achieving a finer grind size—with its associated increases in costs and energy consumption—versus the potential for enhanced liberation and recovery is vital.
In addition to optimizing existing processes, the Mine-to-Mill methodology is also applicable to greenfield projects and expansions. This approach mandates rigorous data collection, which includes ore characterization, historical operational data, thorough drill-and-blast audits, surveys, and benchmarking results. From this data, site-specific mathematical models are developed for each process, encompassing blasting, crushing, and grinding, allowing for the simulation of various operational strategies tailored to different ore types.
Mineser has been actively engaged in several Mine-to-Mill optimization projects, resulting in substantial increases in mill throughput for participating mines.
#MINESER
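As a back-of-the-envelope illustration of the grind-size trade-off described above, the sketch below applies Bond's classical comminution equation with entirely hypothetical inputs (work index, feed size, metal value, and the recovery-versus-grind points). It is not a Mineser model, and site-specific Mine-to-Mill simulations are far more detailed.

```python
# Rough trade-off sketch using Bond's comminution equation
#   W = 10 * Wi * (1/sqrt(P80) - 1/sqrt(F80))   [kWh/t, particle sizes in microns]
# All inputs below are hypothetical. Finer grinding improves the assumed
# recovery but consumes more energy per tonne, cutting power-limited throughput.
import math

def bond_energy_kwh_per_t(work_index, f80_um, p80_um):
    return 10.0 * work_index * (1 / math.sqrt(p80_um) - 1 / math.sqrt(f80_um))

wi, f80 = 14.0, 12_000                 # kWh/t work index, 12 mm mill feed (assumed)
mill_power_kw = 10_000                 # fixed installed grinding power (assumed)
value_per_t_at_full_recovery = 60.0    # $ of payable metal per tonne of ore (assumed)
recovery_at_p80 = {200: 0.68, 150: 0.84, 106: 0.88, 75: 0.90}   # hypothetical liberation curve

for p80, recovery in recovery_at_p80.items():
    energy = bond_energy_kwh_per_t(wi, f80, p80)      # kWh per tonne ground
    throughput = mill_power_kw / energy               # power-limited tonnes per hour
    revenue_per_h = throughput * value_per_t_at_full_recovery * recovery
    print(f"P80 {p80:>3} um: {energy:5.1f} kWh/t, {throughput:6.0f} t/h, "
          f"recovery {recovery:.0%}, revenue ${revenue_per_h:,.0f}/h")
```

With these invented numbers the revenue per operating hour peaks at an intermediate grind size, which is the kind of whole-of-chain balance a Mine-to-Mill study is meant to find with properly measured, site-specific data.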