Power Community


Welcome to the Dataverse Community

Welcome to the latest resources for Dynamics 365, including Customer Engagement, Finance & Operations, and Business Central. Every month we deliver free community bootcamps as well as Microsoft Certification training courses. Power Community is the perfect place to launch your career in Dynamics 365 and lead successful digital transformation projects with Microsoft technologies.



Power Platform Developer Tools monthly release update (August Refresh)

Last month we released a lot of new features, including the Azure DevOps tasks based on the Power Platform CLI. The response from the community has been nothing short of amazing; some of you have reported improved performance in your pipelines. Going forward, we will also publish scenario-driven blogs, separate from regular announcement posts like this one, showing how to use these commands in a given scenario. So, without further ado, let us get started with the new features coming out this release.

New!! Preview: pac connector command

Introducing the new connector command in the Power Platform CLI. This command works against Dataverse: you can now create custom connectors as code-first components directly in Dataverse. If you have already created custom connectors and the pac connector list command does not show them in its output, it is because those connectors are not in Dataverse. To bring a connector into Dataverse, just add it to a solution.

New!! Certificate authentication support for service principals

From this release onwards you can use certificate-based authentication for service principal accounts. This was a widely requested feature from our DevOps users, and it is now available.

New!! Assign-user operation can now assign an application ID

When we released the assign-user capability last month, we got your feedback loud and clear. Some of you have scenarios where the environment is owned not by a user but by a service principal different from the one running the pipeline. This kind of situation occurs when a production environment should not be owned by a single user account. I am glad to report that from this release onwards, including in Azure DevOps and GitHub Actions, assign-user can accept an application user (service principal).
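The features above can be sketched as CLI calls. This is a hypothetical walkthrough, not official documentation: the URLs, IDs, and certificate path are placeholders, and flag names may differ slightly between CLI versions (check `pac auth create --help` and `pac admin assign-user --help`). The script is guarded so it is safe to run even where the Power Platform CLI is not installed.

```shell
#!/bin/sh
# Sketch of this release's new pac capabilities. All values below are
# placeholders; verify flag names against your installed CLI version.
if command -v pac >/dev/null 2>&1; then
  # Preview: list code-first custom connectors stored in Dataverse.
  pac connector list

  # Certificate-based auth for a service principal (hypothetical values).
  pac auth create --name CertProfile \
    --applicationId "11111111-2222-3333-4444-555555555555" \
    --tenant "contoso.onmicrosoft.com" \
    --certificateDiskPath ./sp-cert.pfx

  # Assign an application user (service principal) to an environment.
  pac admin assign-user \
    --environment "https://contoso.crm.dynamics.com" \
    --user "11111111-2222-3333-4444-555555555555" \
    --role "System Administrator" \
    --application-user
fi
echo "pac feature sketch complete"
```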
New!! Data import capability in Power Platform CLI

For a long time, our users have been asking to get the data migration tool into the Power Platform CLI. Now you can use the CLI's data import capability instead: just as you move solutions across environments, you can now also move the Dataverse data those solutions depend on. We have introduced options to export and import data, and the data can be in a zip file or in a directory.

Moving towards just --environment: no more --url or --environmentid

In many recent commands you may have noticed that we are replacing the --url and --environmentid parameters with a single --environment parameter. Whether you pass the URL or the ID, the command resolves for itself which one it received. Keeping with that model, we have updated the pac application list and install commands with this capability, and other commands such as solution list and connector list are following a similar path. Going forward, all you need to do is provide the --env flag with either the environment's URL or its ID, and the system will figure out the right one for you.

Just added: the ability to provide runtime settings for packages

During package deployment, the API has always allowed you to submit runtime settings for the package, such as which solution to run first, but we had unfortunately not exposed that in the CLI. With this release we do, which should make a lot of folks who deploy Power Platform packages happy. In addition, plugin developers can now opt in to strong-name signing for their plugin libraries.

As you can see, we have introduced a number of new capabilities in this update. As always, we look forward to your feedback on these new capabilities.
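The data commands and the unified environment parameter might be exercised as follows. Again a hedged sketch: the schema file, zip name, URL, and ID are invented for illustration, and the exact data flag names may vary by CLI version (see `pac data --help`).

```shell
#!/bin/sh
# Sketch: moving Dataverse data alongside a solution, plus the unified
# --environment parameter. Placeholder values throughout; guarded so the
# script is harmless where pac is not installed.
if command -v pac >/dev/null 2>&1; then
  # Export data described by a schema file into a zip (a directory works too).
  pac data export --schemaFile ./schema.xml --dataFile ./data.zip

  # Import that data into the currently selected environment.
  pac data import --data ./data.zip

  # --environment accepts either the environment URL or the environment ID:
  pac application list --environment "https://contoso.crm.dynamics.com"
  pac application list --environment "00000000-0000-0000-0000-000000000000"
fi
echo "data and environment sketch complete"
```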
The feedback channels are the same as before: reach out to us at ISVFeedback@Microsoft.com or in the Power Users community. Raise issues and bugs on GitHub at https://aka.ms/powerplatform-vscode

Latest News

3 Primary Reasons to Learn Dataverse

Audrie Gordon, Solution Architect, Power CAT, Tuesday, September 6, 2022

If you’ve been on the fence about learning Microsoft Dataverse for your Power Platform solutions, then this blog post is for you. We will explore the most proven motivators for learning, as well as using, Dataverse. We’ll cover three topics: data stewardship, security, and integration.

Data Stewardship

The most inspiring reason for learning Dataverse is the breadth of capabilities you can use to set strong standards for data stewardship. As per Wikipedia: “Data stewardship means the formalization of accountability over the management of data and data-related resources. So, while data governance programs set the rules, data stewardship oversees the smooth implementation of those rules.” Therefore, data stewards seek a comprehensive approach to data management to ensure the quality, integrity, accessibility, and security of the data.

How does Dataverse help? Dataverse is designed to be more than just a database. It can also include data from other data sources, for example through virtual tables. Everyone can take advantage of the built-in Common Data Model (CDM) tables, designed to support strong communications between you and the businesses you partner with. Of course, you can also easily create custom tables, views, and forms. But don’t stop there! Take the data model to the next level by layering in business logic, rules, and process flows to maintain data integrity and guide participants through important process steps or milestones. This end-to-end approach to optimizing enterprise data models enables both solution makers and business analysts to use, and to share, data with confidence.

Get started right away extending the value of tables and leveraging business rules with the new Formula Fx Column, which lets us use Excel-like expressions (aka Power Fx) within table columns.
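As a purely hypothetical illustration (the column names here are invented, not from the post), an Excel-like Power Fx expression in a formula column might look like:

```powerfx
// Hypothetical "Total Price" formula column; 'Unit Price' and Quantity
// are invented column names, and the 8% tax rate is an assumption.
'Unit Price' * Quantity * 1.08
```

Because the expression lives on the column itself, every app built over the table evaluates it the same way, with no per-app logic to copy around.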
In the solution below, I’ve added a column to check when the current contract value exceeds 30% of the original bid value. Now Dataverse will trigger process alerts through Power Automate when this occurs. Since the logic is built into the data, all makers (new and old) automatically gain this logic when they build apps using this table, no matter what type of app they build!

TIP: Leverage Formula Fx columns to drive consistency in processes and notifications.

Security

Microsoft consistently prioritizes security and customer trust. Anyone can learn more about Microsoft’s security and trust commitment, or specifically about security and compliance for the Power Platform. So why do I call this out as one of the three reasons for learning Dataverse? Because Dataverse brings security to the next level with a scenario-focused approach that facilitates a wealth of data visibility, security, compliance, and auditing. Dataverse manages these through several layers of controls, making it a platform with security on steroids. Some of the most commonly used layers of security include, but are not limited to:

The environment: The environment itself is the root container in the tenant for Dataverse, so it’s easy to assign a security group (SG) to an environment. This constrains the environment’s contents (such as database tables) to members of that SG.

Column-level security: Each column within a record can be configured for column-level security. Now we can decide to share all customer account details with the sales team, but restrict access to contract value and invoicing to only the finance team.

Role-based security: Dataverse uses role-based security (RBS) to group together a collection of privileges. These security roles can be associated directly with users, or with Dataverse teams and business units. Users can then be associated with a team, and all users associated with the team will benefit from its roles.
My favorite thing about this is that you can create roles and then insert them into the solution package, so they can easily be reused there or in other solutions. This layered approach to security and record visibility supports the diverse requirements of common business scenarios. It’s not just about who can access what data; it’s also about facilitating need-to-know visibility by combining layered security with filters and views that aid discovery yet reduce noise, in a people-friendly way. Note the roles and the people in the image below: they are all working on a construction project, but they have varied data-access constraints and requirements. Dataverse can ensure that each individual gets to what they need to know, when they need to know it.

TIP: Simplify, and reuse, security roles by storing them in Power Platform solutions.

Integration

Not all of our data starts or ends in Dataverse. Dataverse is designed to help you orchestrate all your enterprise data needs, no matter where that data is stored. In many cases we will want to migrate data, synchronize with it, or simply view it virtually within Dataverse. Both migration and synchronization occur easily using dataflows, a self-service, cloud-based data preparation technology. Dataflows enable customers to ingest, transform, and load data into Dataverse environments, Power BI workspaces, or your organization’s Azure Data Lake Storage account. Customers can trigger dataflows to run either on demand or automatically on a schedule, so data is always kept up to date.

But wait, you don’t always have to move or synchronize data! In many cases, a virtual table is the best choice for leveraging data directly from the source. Any business user can create virtual connections to data external to Dataverse. Check out the new virtual table providers for SQL, SharePoint, and Excel, for example.
Thanks to virtual table providers, we can now take advantage of data outside of Dataverse, layering it into our solutions or enabling more complex scenarios that require modern technologies such as artificial intelligence (AI), machine learning (ML), the Internet of Things (IoT), Azure Functions, extended compute power, and/or dynamic query-driven tables.

TIP: Optimize solutions by layering in the data sources you rely on every day using virtual tables.

My favorite service integrations are those related to optimizing business intelligence (BI) insights (such as with Power BI) and the wide selection of Azure service partnerships, such as with Azure Synapse. Azure Synapse extends both compute power and our ability to create dynamic table queries through the use of Spark or SQL select statements. Both existing Dataverse tables and query tables created in Azure Synapse provide creative opportunities for visualizations and insights in Power BI (learn more in this demo).

TIP: Take advantage of Spark and SQL select statements along with the enhanced compute power of Azure Synapse.

Conclusion

There is definitely a broad return on investment when it comes to learning Dataverse:

Data stewardship at scale: helps us to reuse data and set standards across our business.
Granular security: secures data across the domain and across tables, including column- and role-based security.
Extended integration powers: enables us to bring data from anywhere we need it into our solutions.

Get started learning today: Microsoft Learn for Dataverse.

Additional resources: Security Concepts; Power Fx and Business Rules; Virtual Table Connector Providers; Azure Synapse Link for Dataverse
