Power Community


Introducing Test Engine: An open platform for automated testing of canvas apps

Chris Baldwin, Principal Product Manager, Power Apps Studio, Wednesday, October 5, 2022

Testing is an important part of the software development life cycle. Adequate functional testing of applications helps keep business processes unblocked, reduces support costs, and builds trust in your applications. And as apps grow and become more complex, the ability to ensure that changes do not break or alter the app experience becomes more and more critical. Our mission is to empower all makers of Power Apps to easily create robust tests that can be seamlessly integrated into your organization's application lifecycle management (ALM) practices.

Introducing Test Engine

We're excited to announce the initial preview of Test Engine, an evolution of our Power Apps testing tools. Test Engine builds on the key use cases of Test Studio but takes them in a new, powerful direction through open-source collaboration and use of the Playwright browser testing platform. The goals of Test Engine are to provide customers with a robust testing platform for all types of Power Apps and to make it easy to integrate automated testing into your app development processes. Today's announcement is the first step toward those goals.

The project is provided as an open-source project on GitHub that, when built, creates a local executable you can use to run tests. This initial release supports authoring tests for Power Apps canvas applications. We plan to continue iterating on this project, adding support for all types of Power Apps as well as enhanced tooling to facilitate integration with CI/CD systems like GitHub and Azure DevOps. For now, we welcome you to exercise this tool and give us feedback as we continue to build out the platform.

Power Fx test authoring

Power Fx is the low-code language for Power Platform, and that includes Test Engine.
There is no need to write C#, JavaScript, or any complex code to define your tests. Tests are defined in easy-to-read YAML files using the familiar Power Fx language. Using a few Power Fx functions that you already know from Test Studio, such as Select, Assert, SetProperty, and Index, you can define the sequence of steps your test should run.

Connector mocking

Test Engine allows you to define mock responses for connectors. This technique lets makers continue testing a Power App while isolating it from the remote APIs the app connects to. This can be useful if your apps hit endpoints that have side effects, like inserting rows into Dataverse tables.

Screenshot function and video recording

Test Engine introduces the new Power Fx function Screenshot(), which lets you capture the app state at any point during the test run. For example, you may want to take a screenshot at the beginning and the end of your test suite to capture what the end user sees. Test Engine can also automatically record a video of the entire test run: simply set the recordVideo setting to true. You can use the captured video to observe exactly what the end user sees during the test run, which can be very useful when diagnosing and investigating failed tests.

Only update tests when you update your app

We have heard from customers that using traditional browser automation tools to test Power Apps is difficult to get right and frequently breaks, requiring lots of updates to tests even when the maker hasn't made any changes to the app. This is because traditional testing methods require the test author to interact with the browser document object model (DOM) of the app as its UI is rendered in the web player. With Test Engine, all of this is handled by the platform.
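Pulling the pieces above together (Power Fx test steps, Screenshot(), and the recordVideo setting), a Test Engine test plan might look like the following sketch. The structure follows the samples in the GitHub repo, but the app, control names, and exact property spellings here are illustrative and may differ across preview versions:

```yaml
# Illustrative Test Engine YAML test plan; based on the public samples,
# but app and control names are hypothetical.
testSuite:
  testSuiteName: Calculator tests
  persona: User1
  appLogicalName: new_calculator_app
  testCases:
    - testCaseName: Add two numbers
      testSteps: |
        = Screenshot("add_start.png");
          SetProperty(TextInput1.Text, "1");
          SetProperty(TextInput2.Text, "2");
          Select(AddButton);
          Assert(ResultLabel.Text = "3", "Sum should be 3");
          Screenshot("add_end.png")
testSettings:
  recordVideo: true
  browserConfigurations:
    - browser: Chromium
environmentVariables:
  users:
    - personaName: User1
      emailKey: user1Email
      passwordKey: user1Password
```

The Power Fx steps reference controls by the names defined at design time, which is what keeps the test plan stable when the rendered DOM changes.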
Your tests are written in Power Fx, referencing the control names you define at design time. You will only need to update your test plans if you make changes to the app itself.

Coming soon: reuse tests recorded in Test Studio with Test Engine

If you are a user of Test Studio and already have tests recorded with that tool, we are providing a way for you to reuse them directly in Test Engine. The ability to download test suites from Test Studio will start rolling out to customer tenants toward the end of October 2022. The downloaded test suite can be used in Test Engine without any modification to the test plan file. Watch the Power Apps Studio release notes for information about when the rollout starts.

Next steps

Visit Test Engine on GitHub and try it out. You can build the solution with a couple of commands. Once you've done that, the Test Engine repo has a library of sample test plans and solutions you can use to get started. To use the samples, import the provided solutions into your tenant and run the corresponding test plan. Since it's open source, we welcome contributions to both code and documentation. Let us know about any bugs, feedback, and feature requests using the Issues list in the GitHub project.

Happy testing!
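As a hedged sketch of the "couple of commands" mentioned above, fetching and building the project might look like this. The repo name and build steps are assumptions; the repo's README is the authoritative source:

```shell
# Clone the Test Engine repo and build the local executable.
# Repo URL and commands are assumptions -- check the README for the
# authoritative steps. Requires the .NET SDK.
git clone https://github.com/microsoft/PowerApps-TestEngine.git
cd PowerApps-TestEngine
dotnet build
```

Once built, you can point the resulting executable at one of the sample test plans in the repo.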

What’s new: Power Apps September 2022 Feature Update

Clay Wesener, Partner GPM, Power Apps Studio, Monday, October 3, 2022

Welcome to the Power Apps monthly feature update! We will use this blog to share a summary of product, community, and learning updates from throughout the month so you can access them in one easy place. A variety of new and highly anticipated features are now available, and we are very excited to share them. These updates are in addition to the announcements already shared at the Microsoft Power Platform Conference in September, including Cards for Power Apps, coauthoring, and maker matching. You can read more about them in Ryan Cunningham's blog, Power Apps brings collaboration to center stage with 3 big announcements | Microsoft Power Apps.

Maker productivity:

Power Apps supports the new SharePoint list image column type for both read and write
Power Fx: Introducing Named Formulas
Power Fx: ParseJSON function
Managing related tables in modern form designer

End user productivity:

Cards for Power Apps
Efficiently create performant Fluent UI based Power Apps with the Creator Kit

Maker Productivity

Power Apps supports the new SharePoint list image column type for both read and write

We are happy to announce support for SharePoint list image columns in both read and write scenarios. The feature is currently being deployed and will be in all regions by October 7. Previously we supported only reading images, but you can now create a Power App that reads, creates, updates, and deletes SharePoint image columns directly. You can also choose to get either the full version of the image or various thumbnail sizes: Small, Medium, and Large. Use specific thumbnail sizes for optimal performance. For example, use Small for a gallery or Medium for a form; Large images might be used for detailed inspection.

Power Fx: Introducing Named Formulas

With named formulas, you can simplify your app's initialization, reduce app load time, reuse logic, and improve the maintainability of your apps.
Named formulas derive their name from how this feature appears in Excel, in the "Name Manager." In Excel, you can name any cell and refer to that name throughout the workbook. It adds a level of indirection that lets you refer to a cell by name rather than by location. And, inspiring its introduction into Power Fx, it also allows you to bind a name to a formula that isn't in a cell. Today in Power Fx, you write formulas for the properties of the controls in your app. You can refer to those properties from other formulas to reuse the result of a calculation. But you are limited to the controls and their properties. What if you could effectively create your own properties, your own points of reuse?

Learn more at Power Fx: Introducing Named Formulas | Microsoft Power Apps

Power Fx: ParseJSON function

The JSON format has seen widespread adoption for serializing application objects for systems communication and saving state. Many services today offer RESTful APIs that communicate via JSON payloads, including services such as SharePoint and Azure DevOps. Power Apps provides a large number of out-of-the-box connectors, many of which talk to services via JSON and provide Power Fx types as input and output. But there are cases where a service provides very dynamic JSON payloads, or where the data is provided as text but in practice contains JSON objects.

In 2020 we released an experimental feature called Dynamic Schema, which addresses specific scenarios such as custom fields on Azure DevOps work items. The standard connector for Azure DevOps can only know about standard Azure DevOps fields, but a "capture schema" feature allows a maker to have Power Apps capture the output of the connector call and adapt the schema based on what the output provides. The maker can subsequently work with the connector in Power Fx as if those fields were always part of the connector's schema. This is a fairly static "update" to the schema, made when authoring the app.
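As a hedged sketch of what working with ParseJSON looks like: the function parses a JSON string into an untyped object whose fields must then be coerced to typed values with functions like Text and Value. The JSON payload and variable names here are purely illustrative:

```powerfx
// Illustrative only: parse a JSON string and coerce fields to typed values.
Set( jsonText, "{ ""name"": ""Contoso"", ""count"": 42 }" );
Set( parsed, ParseJSON( jsonText ) );
Set( vendorName, Text( parsed.name ) );    // coerce the untyped field to Text
Set( vendorCount, Value( parsed.count ) ); // coerce the untyped field to Number
```

The explicit coercions are the key point: because the payload's shape isn't known at design time, the maker tells Power Fx what type each field should be.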
The experimental release of ParseJSON addresses the other end of the spectrum and provides an important base in the Power Fx language and runtime to bridge the areas in between. Learn more at Power Fx: Introducing ParseJSON | Microsoft Power Apps

Managing related tables in modern form designer

Makers can now manage related table navigation in the Related tab of the modern form designer. With this feature, makers can navigate within the form to view a list of related table rows. Learn more at Add relationships in a model-driven app form in Power Apps – Power Apps | Microsoft Learn

End User Productivity

Cards for Power Apps

Cards for Power Apps are a completely new way to design and deliver mini-apps directly inside Microsoft Teams using low code. By embedding essential micro-experiences directly in Teams, you can bring your conversation and workflow together like never before. Learn more at What are cards? (Preview) – Power Apps | Microsoft Learn

Efficiently create performant Fluent UI based Power Apps with the Creator Kit

The Creator Kit, published by the Microsoft Power CAT team, is a collection of 24+ Fluent UI controls and templates that help Power Apps makers create delightful app experiences more rapidly. All controls and components included in the kit use the Fluent UI framework to help easily create consistent, beautiful, and effective user experiences for custom business applications. We've received a lot of great feedback since the initial release in May, and in September we announced the stable release of the Creator Kit, indicating the components are ready to be used in your production business applications.

The kit provides many improvements for Power Platform makers, users, and admins. Makers can focus their efforts on building the problem-solving features of an application, and the cohesive Fluent UI design makes it easier to make custom pages look consistent and similar to model-driven apps.
You don't need to be a front-end prodigy, or have the budget for a design team: the kit will help you make stunning apps with the latest and most contemporary designs. Users interact with a cohesive set of components that are intuitive and familiar (the same controls used in all modern Microsoft applications). The components provide a performance boost and a better user experience, which can help users be more productive while using your apps. Administrators who must govern UI consistency within an organization benefit from the modern theming architecture that Fluent UI provides inherently in the kit's components. And the components are developed and supported by dedicated engineering teams at Microsoft, so companies can deploy apps with Creator Kit components into production with confidence.

Learn more at Introducing the Creator Kit – Efficiently create performant Fluent UI based Power Apps | Microsoft Power Apps and Set up the Creator Kit – Power Platform | Microsoft Learn

Learning and doc updates

We've also released new documentation and made updates to existing docs throughout the month. Please continue sending us your feedback on features you would like to see in Power Apps. We hope that you enjoy the update!

Power Platform Developer Tools monthly release update (August Refresh)

Last month we released a lot of new features, including the Azure DevOps tasks based on the Power Platform CLI. The response from the community has been nothing short of amazing, and some of you have reported improved performance in your pipelines. Going forward, we will also publish scenario-driven blogs showing how to use these commands in a given scenario; those will be separate from regular announcement blogs like this one. So, without further ado, let us get started with the new features in this release.

New!! Preview: pac connector command

Introducing the new connector command in the Power Platform CLI. This version works against Dataverse: you can now create custom connectors as a code-first component directly in Dataverse. If you have already created custom connectors and the pac connector list command does not show them in its output, it is because those custom connectors are not in Dataverse. To put a connector in Dataverse, just add it to a solution.

New!! Certificate auth support for service principals

From this release onward, you can use certificate-based authentication for service principal accounts. This was a widely requested feature from our DevOps users, and it is now available.

New!! Assign-user operation can now assign an application ID

When we released the assign-user capability last month, we got your feedback loud and clear. Some of you have scenarios where the environment is owned not by a user but by a service principal different from the service principal account running the pipeline. This kind of situation occurs when a production environment should be owned by a service principal rather than a single user account. I am glad to report that from this release onward, including in Azure DevOps and GitHub Actions, assign-user can accept an application user (service principal).

New!! Data import capability in Power Platform CLI

For a long time, our users have been asking for the data migration tool in the Power Platform CLI. Now you can use the data import capability instead of the data migration tool. Just as you move solutions across environments, you can now also move the Dataverse data those solutions depend on. We have introduced options to export and import, and the data can be in a zip file or in a directory.

Moving toward just --environment: no more URL or environment ID, provide whichever suits you with a single parameter

In most of the recent commands, you may have noticed that we are replacing --url and --environment-id with just --environment. Whether you pass the URL or the ID, the command resolves which one it received from the information provided. Keeping with that model, we have updated the pac application list and install commands with this capability, and other commands like solution list and connector list are following a similar path. Going forward, all you need to do is provide the --environment flag and pass either the URL or the ID of the environment, and the system will figure out the right one for you.

Just added: the ability to provide runtime settings for packages

During package deployment, the API allows you to submit runtime settings for the package, such as which solution to run first. We had unfortunately not exposed that; with this release we do, which should make a lot of folks who deploy Power Platform packages happy. In addition, plugin developers can now opt in to strong-name signing for their plugin libraries.

As you can see, we have introduced a number of new capabilities in this update. As always, we look forward to your feedback.
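As illustrative sketches of the commands described above (these are assumptions, not verified command lines; flag spellings vary across CLI versions, so run `pac <command> help` for the real syntax):

```shell
# Illustrative pac CLI invocations; flag names are assumptions.

# List custom connectors; --environment accepts a URL or an environment ID.
pac connector list --environment https://contoso.crm.dynamics.com

# Export Dataverse data to a zip file, then import it elsewhere.
pac data export --schemaFile schema.xml --dataFile data.zip
pac data import --data data.zip --environment https://target.crm.dynamics.com
```

The `--environment` flag in both examples shows the single-parameter model: the CLI works out whether it was given a URL or an ID.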
The forums for feedback are the same: please reach out to us via ISVFeedback@Microsoft.com or the Power Users community. Raise issues and bugs on GitHub at https://aka.ms/powerplatform-vscode

Power Apps brings collaboration to center stage with 3 big announcements

As we kick off the first ever Microsoft Power Platform Conference to a sold-out crowd buzzing with energy, the power of a thriving and collaborative low code community has never been clearer. Our major product announcements today reflect that truth. We're excited to share three important new ways to collaborate, whether you're building an app together with a team, looking for help inside your company, or taking Teams itself to the next level as a collaborative workplace.

What's new in a nutshell (with a lot more detail below):

Cards for Power Apps are a completely new way to design and deliver mini-apps directly inside Microsoft Teams and Outlook using low code. By embedding essential micro-experiences directly in Teams, you can bring your conversation and workflow together like never before.

Coauthoring is coming to Power Apps as well, using the same real-time collaboration capabilities that millions of people use in Office 365. This is a big step forward for software development, allowing multiple makers to collaboratively build and edit the same app simultaneously.

Maker matching, delivered with a new integrated virtual agent, will help guide makers to the learning content they need while also connecting them with experienced advisors from within their organization. In-context help together with well-connected internal communities means more successful low code solutions at scale.

Cards for Power Apps

Cards for Power Apps is a new addition to the low code lineup, allowing business users and developers alike to build visual, interactive, data-driven, and actionable cards. These embeddable micro-apps can be shared with other people in Teams and Outlook with a customizable visual interface. Cards are lightweight micro-apps created in low code and embedded into Teams.

Cards for Power Apps is based on the popular Adaptive Card framework, used by professional developers to serve millions of users with pre-built actionable cards today.
By bringing this capability to low code, we're dramatically broadening who can build and deliver lightweight, embedded experiences.

Get started in minutes with a no-code card designer

The drag-and-drop card designer has all the familiar elements of Power Apps. Makers can create a customized card UI consisting of buttons, tables, text, images, options, checkboxes, input fields, containers, and many other components without writing any code.

Add data and connect to services

The Power Platform connectors in the card designer allow users to connect to enterprise data and back-end services safely and securely, filling the card with relevant data when a user sees it in Teams. Of course, Microsoft Dataverse is built right in as well, making it possible to easily build lightweight cards backed by a powerful enterprise data platform. Cards use all the familiar tools of Power Apps, including connectors and Power Fx.

Add business logic with Power Fx

Support for Power Fx, the open-source low code programming language based on Excel, makes fast inline calculations and dynamic actions possible in cards. Makers can create sophisticated logic, including data operations performed via Power Platform connectors, connected to interactive UI elements in the card.

Send your cards in Teams

Users can post their cards in Teams channels, chats, and meetings. Links to cards automatically render into a full interactive card inline, filled with data, all while keeping enterprise data securely within an organization. Cards can be easily shared in Teams.

Cards are rolling out now in public preview to Power Apps makers and should light up in your region soon. The first wave builds on Dataverse as a backend and uses Teams as the target for sharing; we'll be rapidly lighting up Power Platform connectors and additional sharing options over the coming months.
We hope you love this amazing new capability to build embeddable, interactive, and actionable cards in Power Apps, and we cannot wait to hear your feedback. Happy card building!

Coauthoring in Power Apps

Developing software has traditionally been a "single player" experience. Developers check out code to a local branch, make their changes individually, and merge back to the main project. Until now, the vast majority of low-code development has followed a similar pattern of one maker per component at a time.

With coauthoring, all of that changes dramatically. Now multiple makers can see who else is present in the app, see what parts of the app other makers are working on, and see changes made to the app in real time. Makers can add new pages and work on them without disturbing others' work. This dramatically improves collaboration and development efficiency, whether you've been a software developer for your entire career or are just getting started with low code. Coauthoring in Power Apps works like Office 365 documents.

Just like Office 365, makers can see who else is building in their app and easily chat with them. You can tag people in comments for asynchronous communication, or everyone can open an app and watch each other make changes together in real time. The difference is that it's not one person sharing the screen; everyone sees the same version of the app and can add, edit, and remove app components without disturbing the work of other makers.

We've heard from many customers that they want to use coauthoring to help new makers grow within the organization. With coauthoring, novice makers can shadow experienced makers to watch them build an app, or complete tasks that an experienced maker sets up for them. Whether you have a new cohort of developers or your marketing department wants to learn how to build internal apps, you can start a project and teach them in a single environment.
Coauthoring is rolling out now for makers using the modern app designer.

Maker matching with an integrated virtual agent in Power Apps

To assist makers in their learning and development journey, we are introducing an integrated virtual agent in Power Apps that can guide makers to Power Apps learning content as well as internal resources specific to an organization. But the special thing about this bot is that it isn't just a bot: Power Apps will also match users with more experienced makers from their own organization, right within the product. Maker matching connects people who can advise new makers during development, enabling people to build apps better and faster with help that is "close to home." The virtual agent also surfaces internal community channels and resources to help you build and stay connected to your maker community.

Experienced makers can be nominated as advisors into the Advisor Program in one of two ways. Admins can assign specific individuals as advisors, and over time successful makers with strong track records will automatically receive in-product invitations through intelligent matching. Once nominated, advisors are prompted in Power Apps to opt into the Advisor Program, and they can then start helping new makers in their organization. The virtual agent will begin matching makers soon in public preview.

What's next?

You can try all these features and more by signing up for a free developer plan to build apps or a free trial to use apps that other makers have built. Want to do more with premium connectors and Dataverse? Check out paid plans.

Power Fx: Introducing Named Formulas

I am thrilled to introduce you to an old concept. A powerful concept that has been in Excel for a very long time, and that we are now bringing to Power Fx. With it, you can simplify your app's initialization, reduce app load time, reuse logic, and improve the maintainability of your apps. Welcome named formulas.

A funny name, derived from how this feature appears in Excel, in the "Name Manager." In Excel, you can name any cell and refer to that name throughout the workbook. It adds a level of indirection that lets you refer to a cell by name rather than by location. And, inspiring its introduction into Power Fx, it also allows you to bind a name to a formula that isn't in a cell.

Today in Power Fx, you write formulas for the properties of the controls in your app. You can refer to those properties from other formulas to reuse the result of a calculation. But you are limited to the controls and their properties. What if you could effectively create your own properties, your own points of reuse?

Advantages of named formulas

Enough theoretical preamble. What if you could write this in the App.Formulas property?

UserEmail = User().Email;
UserInfo = LookUp( Users, 'Primary Email' = User().Email );
UserTitle = UserInfo.Title;
UserPhone = Switch( UserInfo.'Preferred Phone',
    'Preferred Phone (Users)'.'Mobile Phone', UserInfo.'Mobile Phone',
    UserInfo.'Main Phone'
);

These are formulas in the truest sense of the word. They express how to calculate UserEmail, UserInfo, UserTitle, and UserPhone from other values, much like F = m * a in physics calculates force. This logic is now encapsulated, can be used throughout the app, and can be updated in this one location. It can be changed from using the Dataverse Users table to using the Office 365 connector without needing to change formulas in the rest of the app. These formulas don't say anything about when or how they should be calculated. They are truly declarative.
They provide a recipe only. OK, you are probably wondering: why use this? Why not just use Set in App.OnStart to accomplish the same thing? You certainly can, and we have no intention of ever taking that ability away. State variables and Set will always have their place. But named formulas have advantages:

The formula's value is always available. There is no timing dependency, no App.OnStart that must run first before the value is set, no time in which the formula's value is incorrect. Named formulas can refer to each other in any order, so long as they don't create a circular reference, and they can be calculated in parallel.

The formula's value is always up to date. The formula can perform a calculation that depends on control properties or database records, and as they change, the formula's value automatically updates. You don't need to manually update the value as you do with a variable.

The formula's definition is immutable. The definition in App.Formulas is the single source of truth, and the value can't be changed somewhere else in the app. With variables, it is possible for some code to unexpectedly change a value, but this is not possible with named formulas. That doesn't mean a formula's value must be static. It can change, but only if its dependencies change.

The formula's calculation can be deferred. Because its value is immutable, it can always be calculated when needed, which means it need not actually be calculated until it is needed. If the value is never used, the formula need never be calculated. Formula values that aren't used until screen2 of an app is displayed need not be calculated until screen2 is visible. This can dramatically improve app load time. Named formulas are declarative and provide opportunities like this for the system to optimize how and when they are computed.

Named formulas are an Excel concept. Power Fx leverages Excel concepts where possible, since so many people know Excel well.
Named formulas are the equivalent of named cells and named formulas in Excel, managed with the Name Manager. They recalc automatically like a spreadsheet, just as control properties do.

Implications for OnStart

Last year, we introduced the App.StartScreen property as a declarative alternative to using Navigate in App.OnStart. It has been very successful; today App.StartScreen is used much more than the old pattern. At the time, I explained that there are three main reasons for using OnStart:

Indicating which screen should be shown first.
Setting up global variables.
Prefetching and caching data.

With named formulas, we are now addressing the second item on this list. The third item is still being worked on and is the subject of a future discussion.

People love to use Set in their OnStart. A recent study showed that when App.OnStart is used, 84% of those properties include a Set. I get it. I personally use it. I championed the addition of Set years ago. Until now, this has been the only way to set up a value for reuse across your app. For example, I've written a chess app that heavily uses App.OnStart. Many of the Set calls set up simple constants. The board is represented as a string with metadata at the end, an unpacked form of chess FEN notation.
Set( BoardSize, 70 );
Set( BoardLight, RGBA( 240, 217, 181, 1 ) );
Set( BoardDark, RGBA( 181, 136, 99, 1 ) );
Set( BoardSelect, RGBA( 34, 177, 76, 1 ) );
Set( BoardRowWidth, 10 ); // expected 8 plus two guard characters for regular expressions
Set( BoardMetadata, 8 * BoardRowWidth + 1 ); // which player is next, have pieces moved for castling rules, etc
Set( BoardBlank, "----------------------------------------------------------------_00000000000000" );
Set( BoardClassic, "RNBQKBNR__PPPPPPPP------------------------_--------__pppppppp__rnbqkbnr__0000000000" );

This is easy to translate to named formulas:

BoardSize = 70;
BoardLight = RGBA( 240, 217, 181, 1 );
BoardDark = RGBA( 181, 136, 99, 1 );
BoardSelect = RGBA( 34, 177, 76, 1 );
BoardRowWidth = 10; // expected 8 plus two guard characters for regular expressions
BoardMetadata = 8 * BoardRowWidth + 1; // which player is next, have pieces moved for castling rules, etc
BoardBlank = "----------------------------------------------------------------_00000000000000";
BoardClassic = "RNBQKBNR__PPPPPPPP------------------------_--------__pppppppp__rnbqkbnr__0000000000";

Note that none of the references to these values needs to change. You can cut and paste from App.OnStart to App.Formulas, modify the syntax appropriately, and that's it. Wherever BoardClassic was being used, it can continue to be used as before. Also note that the order of these definitions is no longer important, so the definitions of BoardRowWidth and BoardMetadata can now appear in any order.

Let's look at a more advanced case. In App.OnStart I have this imperative logic:

If( !IsBlank( Param( "TestPlay" ) ),
    Set( PlayerId, Lower( Param( "TestPlay" ) ) );
    Set( AdminMode, true ),
    Set( PlayerId, Lower( Left( User().Email, Find( "@", User().Email ) - 1 ) ) )
);
If( !IsBlank( Param( "ReviewGo" ) ), Set( ReviewGo, true ) );

Instead of setting these variables, the formulas for how to calculate them can be specified in App.Formulas.
The result is easier to read, decoupling the definitions of PlayerId, AdminMode, and ReviewGo:

PlayerId = If( !IsBlank( Param( "TestPlay" ) ),
    Lower( Param( "TestPlay" ) ),
    Lower( Left( User().Email, Find( "@", User().Email ) - 1 ) )
);
AdminMode = !IsBlank( Param( "TestPlay" ) );
ReviewGo = !IsBlank( Param( "ReviewGo" ) );

Finally, no, I didn't implement a chess-playing algorithm directly in Power Fx (yet). Instead, I am using the excellent open-source Stockfish chess engine running on an Azure VM. A custom connector is used to communicate with the VM, which also manages the games between the players. In App.OnStart:

Set( Players, ChessPHP.Players() );
Set( PlayerName,
    If( IsBlank( LookUp( Players.players, Lower( player ) = Lower( PlayerId ) ) ),
        PlayerId,
        LookUp( Players.players, Lower( player ) = Lower( PlayerId ) ).name & " (" & PlayerId & ")"
    )
);

In App.Formulas:

Players = ChessPHP.Players();
PlayerName = If( IsBlank( LookUp( Players.players, Lower( player ) = Lower( PlayerId ) ) ),
    PlayerId,
    LookUp( Players.players, Lower( player ) = Lower( PlayerId ) ).name & " (" & PlayerId & ")"
);

Very similar. But what's great about this is that, until I actually show PlayerName somewhere, the web service API call behind ChessPHP.Players() doesn't need to happen. The app doesn't take precious time up front during app load to get this information. Because of the formula, the app knows how to get the needed information when the time is right.

I'm still converting my app to fully take advantage of named formulas. Ideally, I will get to a point where I can point to each piece of state and explain why it needs to be mutable and why it can't be a formula. I expect that to be a handful of items, rather than the hundred or so state variables my app uses today. Named formulas definitely change how you think about state and code reuse in your app. To reiterate, I'm not anti-state or anti-Set. Mutable state is essential and differentiates Power Fx from Excel. Set is here to stay.
But I do believe it should be used more sparingly than it is today. When the Set hammer was our only tool, everything looked like a nail. We now have a new tool to explore, and only real-world experience will inform us of the best ways to use them both.

Experimental for feedback

And now it is your turn. Named formulas are an experimental feature in version 3.22091. The App.Formulas property will only appear if the Named formulas experimental feature is enabled in Settings > Upcoming features > Experimental. As an experimental feature, do not use named formulas in production apps. How will you use this new feature? What could be improved before it is enabled by default? As always, your feedback is very valuable to us. Please let us know what you think in the Power Apps experimental features community forum.

What's next

If you follow Excel, you will know that they added the Lambda function a few years ago. A colossal step forward for Excel, it enables user defined functions to be written directly in the formula language. And they used the named formulas concept to do it. For example, a set of functions can recursively walk a tree of employees and their reports in Excel, using the Advanced Formula Environment add-in for authoring. This is essentially named formulas with parameters, which become user defined functions. Or, alternatively, you can think of named formulas as user defined functions that have no parameters. We plan to do something similar. In fact, there is an early prototype version of it running in our Power Fx open source GitHub repo at https://github.com/microsoft/power-fx. The logic looks different because we prefer to have the parameters on the left hand side of the =, we are strongly typed (we added types much as TypeScript added types to JavaScript), and we have lambdas built in to many of our functions, such as Sum. We are experimenting; this all may change.
But the structure of how this works is consistent with what Excel is doing.

How does this relate to canvas components? Canvas components can provide pure functions today with the experimental enhanced component properties feature. We are working to move this out of experimental, after having made canvas components generally available earlier this summer. Canvas components are great and support sharing across apps through a component library. However, they were designed primarily for a different use case: a user defined control. Functions written in this way must be instanced as an object, appear as methods on that object, need to be placed on a UI canvas, and are heavyweight to define using the builder UI in Studio. Named formulas, and future user defined functions, are much lighter weight, easier to create, and can be more easily copied from other apps, documentation, or web examples. They can also be leveraged across all Power Fx hosts, with no canvas required. We have no firm dates on when this will become available in Power Apps, but we know code reuse is very much on your minds, and so it is on our minds too.
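To give a flavor of the direction, here is an illustrative sketch of a user defined function in the experimental syntax from the Power Fx open source repo. This is an assumption based on the prototype described above; the syntax is still evolving and may change:

```powerfx
// Illustrative only: parameters and their types appear on the left of the =,
// with the return type after the parameter list
Add( x: Number, y: Number ): Number = x + y;

// Once defined, it can be called like any built-in function
Add( 1, 2 )  // 3
```

Note how this mirrors the named formula syntax: a named formula is simply the degenerate case with no parameter list.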

Power Fx: Introducing ParseJSON

We are pleased to announce the experimental release of our ParseJSON function, which can parse JSON strings into Power Fx objects and types. This new function dramatically simplifies working with JSON in canvas apps. For example, we recently worked with a customer who needed to extract information from a relatively simple JSON string stored in a Dataverse table:

[
    {
        "PredictionID": "e438ec93-dee2-4a6c-92d1-8c9e87a1b1d3",
        "CustomerID": "2c9c3dae-5113-4b6b-9cc5-ad25b955b463",
        "id": "b8cf0ea6-b6f5-46c3-9fc2-5403fb7fdd2d",
        "Score": 0.7848321,
        "ProductID": "628065",
        "ProductName": "Large Widget"
    },
    {
        "PredictionID": "5ac7b6aa-d069-4c2d-b593-bda5bf3e2f56",
        "CustomerID": "2c9c3dae-5113-4b6b-9cc5-ad25b955b463",
        "id": "b8cf0ea6-b6f5-46c3-9fc2-5403fb7fdd2d",
        "Score": 0.82974886,
        "ProductID": "527174",
        "ProductName": "Small Widget"
    },
    ...

Their first solution was to use a Microsoft Power Automate flow, which has excellent JSON support, to do the parsing. However, the volume and latency of these calls didn't work for their scenario. So, like many of you, we turned to regular expression matching using this formula:

MatchAll(
    ProductRecommendationsJSON,
    """Score""\s*:\s*(?<score>[^,]*)\s*,\s*" &
    """ProductID""\s*:\s*""(?<productid>[^""]*)""\s*,\s*" &
    """ProductName""\s*:\s*""(?<productname>[^""]*)""",
    MatchOptions.Multiline
)

This works, and it is what the customer shipped their app with. However, it is hard to write and verify with that complex regular expression syntax, and it is fragile: for example, it will break if the order of the JSON fields is changed or another field is inserted. A better answer is a dedicated JSON parser, a highly requested feature on our community ideas forum.
With the new ParseJSON function, this can be written as:

ForAll(
    Table( ParseJSON( ProductRecommendationsJSON ) ),
    {
        Score: Value( Value.Score ),
        ProductID: Text( Value.ProductID ),
        ProductName: Text( Value.ProductName )
    }
)

This is both easier to read and more robust, tolerating reordered and additional fields. We have plans to further simplify this with better untyped object array support and casting as a whole via a schema, rather than field by field. But this will get you started and will continue to be supported with those enhancements.

Note that this is an experimental feature. This functionality will change. We are releasing it now for your feedback and for use in proof-of-concept, non-production situations. To use this functionality, you must opt in to the feature under Settings > Upcoming features > Experimental. Please post your feedback in our experimental features forum. To learn more, keep reading, and check out our Working with JSON documentation.

JSON and Power Apps

The JSON format has seen widespread adoption in serializing application objects for systems communications and saving of state. Many services today offer RESTful APIs that communicate via JSON payloads, including services such as SharePoint and Azure DevOps. Power Apps provides a large number of out-of-the-box connectors, many of which talk to services via JSON and provide Power Fx types as input and output. But there are cases where either a service can provide very dynamic JSON payloads, or the data is provided as text but in practice contains JSON objects. In 2020 we released an experimental feature called Dynamic Schema, which addresses specific scenarios such as custom fields on Azure DevOps work items. The standard connector for Azure DevOps can only know about standard Azure DevOps fields, but a "capture schema" feature allows a maker to have Power Apps capture the output of the connector call and adapt the schema based on what the output provides.
A maker can subsequently work with the connector in Power Fx as if the fields were always part of the connector’s schema. This is a fairly static “update” to the schema that can be made when authoring the app. Today’s experimental release of ParseJSON addresses the other end of the spectrum, and provides an important base in the Power Fx language and runtime to bridge the areas in between. Untyped Object To handle the most dynamic scenarios with a JSON string, we need to address the fact that Power Fx is a strongly-typed language. The JSON format, in comparison, offers very few data types. Additionally, in a scenario where the JSON payload can change when the Power App is run (as opposed to when it is built) we must have the ability to read the JSON and convert into the types we need from the formulas that need them. To support this most dynamic of scenarios, we have introduced a new type called untyped object. It is, in effect, a wrapper for a value or object that can be converted to a concrete Power Fx type at the time the app runs. When authoring the app, makers can use the untyped object to write formulas that make assumptions about the nature of the actual object or value when a user will use the app. This does mean the maker has some knowledge about the incoming data and its structure, as this metadata is not available during authoring and the untyped object cannot provide any help (such as IntelliSense). Untyped objects can contain values (numbers, text, Boolean, dates, etc.), records and tables. To use any underlying values of untyped objects, they have to be cast explicitly to their respective Power Fx types using the type construction functions such as Boolean (a new function), Text, Value, DateTimeValue, etc. ParseJSON The ParseJSON function will be the first (and for now the only) function to return an untyped object. It accepts a text value that is a valid JSON string. 
For example, assume a custom connector "myConnector" that provides a GetJSONString() function:

ParseJSON( myConnector.GetJSONString() )

Now assume the connector returns the following JSON:

{
    "TextField": "Hello, World!",
    "Version": 1.1
}

We can store the untyped object that ParseJSON returns in a variable called "untypedVariable", and access the individual fields with the regular dot notation:

Set( untypedVariable, ParseJSON( myConnector.GetJSONString() ) )

untypedVariable.TextField
untypedVariable.Version

However, the fields on untypedVariable (TextField, Version) are also untyped objects and have to be explicitly converted to a type that can be used in formulas for Power Apps properties. For example, to use TextField in the text property of a label control, the Text() function has to be used to convert the untyped object value to a text value:

Text( untypedVariable.TextField )

Similarly, a JSON array of values or records can be converted to a table directly with the Table function. That will result in a single-column table of untyped objects, requiring each untyped object value or record field in the table to be converted to a proper type. For example, we can parse the following JSON string into an untyped object variable named "untypedVariable":

[
    { "IndexField": 1, "Title": "One" },
    { "IndexField": 2, "Title": "Two" }
]

We can now convert this JSON array of records into a table with the Table function:

Table( untypedVariable )

If you wish to use this table in, for example, a collection and gallery, you have two options. Either use the table as-is, and in the gallery convert any property values to specific Power Fx types. Or, convert the table to a typed table with ForAll prior to assigning it to the gallery.
To create the collection of untyped objects, you can use the following formula:

ClearCollect( collection, Table( untypedVariable ) )

Inside the gallery using this collection, each field will need to be converted into the correct Power Fx type:

Text( ThisItem.Title )

To convert the untyped object array directly into a typed Power Fx table, you can use ForAll, which can return a table:

ClearCollect(
    collection,
    ForAll(
        Table( untypedVariable ),
        {
            IndexField: Value( Value.IndexField ),
            Title: Text( Value.Title )
        }
    )
)

Now the gallery can use the collection with an entirely typed record. For more information and examples, see the Working with JSON article on the docs site.

What's Next

The ParseJSON function with untyped objects provides a very generic way to handle dynamic incoming JSON through Power Fx formulas. We are considering next steps to bridge the scenarios in between dynamic schema capture during authoring and untyped objects in formulas. Imagine declaring your own schema through Power Fx formulas or YAML definitions, and having connectors or ParseJSON automatically convert the incoming payload to the Power Fx typed schema you defined. This would provide a broad spectrum of options: from capturing a schema from a sample payload, to defining schemas that can convert automatically, all the way to code-first handling with untyped objects. While we work out what, how, and when we can introduce this spectrum of features, we plan to move the current ParseJSON and untyped object features to preview and GA quickly, as we believe the flexibility they provide will allow makers to address their JSON parsing needs. Additionally, untyped objects provide a basis for other potential features and functions that deal with external data structures, such as CSV, XML, etc. As always, we value your feedback and suggestions on the currently released features and future roadmap. Join the discussion in our experimental features forum.
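To recap the explicit casting described above, here is a small illustrative sketch. The payload and field names are invented for this example:

```powerfx
// Parse an assumed JSON payload into an untyped object
Set( untyped, ParseJSON( "{ ""Name"": ""Contoso"", ""Count"": 3, ""Active"": true }" ) );

// Each field must be cast before use in a typed property
Text( untyped.Name )       // text value "Contoso"
Value( untyped.Count )     // number 3
Boolean( untyped.Active )  // Boolean true
```

Until a cast is applied, each dot-notation access yields another untyped object, which is why controls cannot bind to it directly.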

Power Platform Administration Planning

Steve Jeffery, Principal PM, Power CAT, Thursday, September 8, 2022

Behind the scenes, your IT and Center of Excellence team spends time configuring, managing and nurturing the adoption of Microsoft Power Platform. Understanding how that time is spent can help you plan who you need on your team and find the highest-impact opportunities to streamline the administrative effort. We've put together a simple solution (Power Platform Administration Planning) that is designed to help you better:

Plan your team structure
Review where you spend time and look for automation or innovation opportunities

The solution is built on Microsoft Dataverse and is a new stand-alone component in the CoE Starter Kit. Admin tasks are defined in a model-driven app and insights are provided in a Power BI dashboard. You can start either by adding your own tasks or by importing a set of example tasks from an Excel spreadsheet, populated with some of the most common administration tasks covering:

AI Builder
Environments and connectors
Power Apps
Power Automate
Power Pages
Power Virtual Agents

You'll need to review the tasks and populate the task metadata to get the best out of the Power BI dashboard.

Plan your team structure

If you're getting started, you might find importing the sample tasks spreadsheet a great source of inspiration. Where possible, we added the most common administrative tasks that administrators perform and supplied links to supporting/instructional documentation. Task metadata is used by the dashboard to provide useful information about your team structure and the level of expertise required to complete your administrative work.

Review where you spend time

You may already be administering the platform and looking for ways to increase your maturity, including automation or innovation opportunities.
When you add your own estimation of which tasks you need, how much time you'll spend doing them, and who will do them, the dashboard will indicate whether the size of your team is sufficient, which roles and experience levels are needed, and an estimate of how much time to expect to spend administering the platform. After your team has been doing the work for a while, you can update it with exact data.

Admin tasks in the model-driven app

Tasks can be imported from the sample spreadsheet or manually created. Tasks have the following schema:

Name – A brief description of the task, e.g., 'Create an environment.'
Task description – Longer description of the task.
Task documentation link – URL (Uniform Resource Locator) to documentation.
Active task – Yes/no: is this a task that you currently perform?
Outsourced task – Yes/no: is this task outsourced?
Automation – Yes/no: is this task automated?
Frequency – Choice: how often is this task performed?
Anticipated task iterations – Number: how many times do you expect to perform this in one year?
Duration – Number: how long, in minutes, does this task take?
Experience required – Choice: what level of ability is needed?
Core admin persona – Choice: which core admin persona usually performs this task?
Peripheral admin persona – Choice: which peripheral admin persona is involved in this task?
Primary task category – Choice of task categories.
Secondary task category – Choice of task categories.
Product or service – Choice of Power Platform applications.

Power BI dashboard

The structure of the dashboard is designed to help you focus on what you do, how reactive your team is, and the impact that automation and outsourcing (if your organization uses them) have on your overall efficiency.

Team, outsourcing and automation

The Team, outsourcing and automation section aggregates administrative task data, indicating:

Team workload – enter the number of staff in your team. Based on aggregated effort, the required hours (per team member) to complete the tasks are estimated.
Outsourcing – what impact, if any, outsourcing is having on your workload, how many resources are needed, and what level of expertise is required.
Automation – insights into how much time you're saving by automating tasks, and the expertise that would otherwise be required.

Task breakdown – proactive & reactive balance

The Task breakdown – proactive & reactive balance section helps you focus on the balance between planned tasks and those that are ad-hoc, or reactive. By displaying the percentage of tasks that have been categorized as ad-hoc, and providing a filtered list, it encourages you to look for automation, outsourcing or innovation opportunities.

Task breakdown – experience & personas

The Task breakdown – experience & personas section aggregates the duration of all tasks by frequency and provides insight on:

Experience levels required – illustrates the experience required across all tasks. This can be useful in estimating training requirements for your existing team.
Insight for each 'core admin persona' and 'peripheral persona' – useful for understanding how many of your tasks rely on additional teams to complete. For example, creating an environment may also require Azure AD security groups to be created for managing access.

Task overview: experience, persona & categorization

Administrative tasks are categorized, which is useful in understanding where your team spends the most time. Tasks have two categories to provide deeper insight. For example, selecting 'Reporting' will not only filter the list of administrative tasks to display tasks categorized as reporting, it will also display those tasks by their secondary category. This is especially useful for understanding what type of reporting your team is focusing on, and how much time they are spending on it.

Where you can get it

Microsoft Power Platform Administration Planning is a standalone module in the CoE (Center of Excellence) Starter Kit, which means it's open sourced and available for download from the same GitHub repository as the rest of the toolkit.
Setup guidance and further information is available. Watch the Power CAT Live! video where we go into more detail about this solution:

Announcing public preview of Content Security Policy for Power Apps

We’re excited to announce the public preview of Content Security Policy for Power Apps! Power Apps has had Content Security Policy (CSP) support for model-driven apps since the beginning of the year, which was configured by running script as a System Administrator. With these new capabilities, you can now control the CSP header for model-driven as well as canvas apps in the environment in Power Platform Admin Center. CSP can be configured in both enforced and report-only mode. Configuration in Power Platform Admin Center CSP can be configured using the Content security policy settings under the Privacy + Security section of an environment in Power Platform Admin Center. Turning enforcement on will provide protection against clickjacking attacks for apps in that environment. CSP is configured independently for model-driven and canvas apps, except for reporting which applies to both. Reporting and enforcement are disabled by default, and we recommend you turn on enforcement in your production environments only after testing your apps in a sandbox environment with CSP turned on to ensure any intended functionality isn’t blocked due to this change. We also recommend turning reporting-only mode on in production before enforcement to catch any lingering issues before enforcement is enabled. CSP support for canvas apps Model-driven apps have had the ability to send default and custom CSP for some time. With this update we’ll support CSP for canvas apps as well. The default and customizable pieces of the CSP header are the same for both model-driven and canvas, but they are configured independently, allowing you to perform a gradual CSP rollout. Violation reporting As part of the CSP settings, you can also enable reporting and provide a custom reporting endpoint to receive any content security policy violation reports. This capability helps preview what violations would be blocked before turning it on completely. 
Refer to the Content Security Policy documentation for details on building a reporting endpoint. Please review the documentation for more details, and as always, we would love to hear from you on how we could keep improving this feature. Please leave your feedback and comments on this post.
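For orientation, the clickjacking protection described above is typically expressed through the frame-ancestors CSP directive, and report-only mode is delivered via a separate header. A hypothetical report-only header is sketched below; the exact directives and sources Power Apps emits may differ, and the reporting endpoint shown is invented:

```
Content-Security-Policy-Report-Only: frame-ancestors 'self'; report-uri https://contoso.example/csp-report
```

Because the browser only reports violations of a report-only policy rather than blocking content, this mode is a safe way to observe what enforcement would break before turning it on.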

3 Primary Reasons to Learn Dataverse

Audrie Gordon, Solution Architect, Power CAT, Tuesday, September 6, 2022 If you’ve been on the fence about learning Microsoft Dataverse for your Power Platform solutions, then this blog post is for you. We will explore the most proven motivators for learning, as well as using, Dataverse. We’ll cover three topics: Data Stewardship, Security and Integration. The most inspiring reason for learning Dataverse is the breadth of capabilities you can use to set strong standards for data stewardship. As per Wikipedia: “Data Stewardship means the formalization of accountability over the management of data, and the data-related resources. So, while data governance programs set the rules, data stewardship oversees the smooth implementation of those rules.” Therefore, data stewards seek a comprehensive approach to data management to ensure the quality, integrity, accessibility and security of the data. How does Dataverse help? Dataverse is designed to be more than just a database. It can also include data from other data sources, for example through virtual tables. Everyone can take advantage of the built-in Common Data Model (CDM) tables designed to support strong communications between you and the businesses you partner with. Of course, you can also easily create custom tables, views, and forms. But don’t stop there! Take the data model to the next level by layering business logic, rules, and process flows to maintain data integrity and guide participants through important process steps or milestones. This end-to-end approach of optimizing enterprise data models enables both solution makers and business analysts to use, and to share data, with confidence. Get started right away extending the value of tables and leveraging business rules with the new “Formula Fx Column“. The Formula Fx Column enables us to use Excel-like expressions within table columns (aka Power Fx). 
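A formula column's value is computed from other columns on the same row using a Power Fx expression. Here is a minimal sketch of such a column for the contract scenario discussed in this post; the column names are hypothetical:

```powerfx
// Hypothetical formula column on a Contracts table:
// flags contracts whose current value exceeds the original bid by more than 30%
If( 'Contract Value' > 'Original Bid' * 1.3, "Review required", "OK" )
```

Because the expression lives in the Dataverse column itself, every app built over the table evaluates the same logic automatically.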
In the solution below, I’ve added a column to check when the current contract value exceeds 30% of the original bid value. Now Dataverse will trigger process alerts through Power Automate when this occurs. Since the logic is built into the data, all the Makers (new and old) will automatically gain this logic when they build apps using this table; no matter what type of app they build! TIP: Leverage Formula Fx columns to drive consistency in process and notifications Microsoft consistently prioritizes security and customer trust. Anyone can learn more about Microsoft’s Security and Trust commitment, or specifically about security and compliance for the Power Platform. So, why do I call this out as one of the three reasons for learning Dataverse? Because Dataverse brings security to the next level with its scenario-focused approach that facilitates a wealth of data visibility, security, compliance and auditing. Dataverse manages these through several layers of controls making it a platform with security on steroids. Some of the most commonly used layers of security include, but are not limited to: The Environment: The environment itself is the root container in the tenant for Dataverse. So it’s easy to assign a Security Group (“SG”) to an environment. This will constrain the environment contents (such as database tables) to members of that SG. Column-Level Security: Each column within a record can be configured for column-level security. Now we can decide to share all Customer Account details with the Sales team, but restrict access related to contract value and invoicing to only the Finance team. Role-Based Security: Dataverse uses role-based security (RBS) to group together a collection of privileges. These security roles can be associated directly to users, or they can be associated with Dataverse teams and business units. Users can then be associated with the team, and therefore all users associated with the team will benefit from the role. 
My favorite thing about this is that you can create roles and then insert them into the Solution Package so they can easily be reused there or in other solutions. This layered approach to security and record visibility supports the diversity of requirements needed for common business scenarios. It’s not just about ‘who can access what data’, it’s also about facilitating need-to-know visibility by combining layered security with filters and views to aid in discovery, yet reduce noise in a people-friendly methodology. Note the roles and the people in the image below. They are all working on a construction project but they have varied data access constraints and requirements. Dataverse can ensure that each individual gets to what they need to know, when they need to know it. TIP: Simplify, and reuse, Security Roles by storing them in Power Platform Solutions: Not all of our data starts or ends in Dataverse. Dataverse is designed to help you to orchestrate all your enterprise data needs, no matter where that data is stored. In many cases we will want to migrate, synchronize with, or simply just view data virtually within Dataverse. Both migration and synchronization occur easily using Dataflows. Dataflows are a self-service, cloud-based, data preparation technology. Dataflows enable customers to ingest, transform, and load data into Dataverse environments, Power BI workspaces, or your organization’s Azure Data Lake Storage account. Customers can trigger dataflows to run either on demand or automatically on a schedule; data is always kept up to date. But wait, you don’t always have to move or synchronize data! In many cases, having a “virtual” table is the best choice for leveraging the data directly from the source. Any business user can create virtual connections to data external to Dataverse. Check out the new virtual table providers for SQL, SharePoint, and Excel for example. 
Thanks to virtual table providers, we can now take advantage of data outside of Dataverse to layer it into our solutions, or to enable more complex scenarios that require modern technologies such as Artificial Intelligence (AI), Machine Learning (ML), Internet of Things (IoT), Azure functions, extended compute power, and/or dynamic query-driven tables. TIP: Optimize solutions by layering in the data sources that you rely on every day using virtual tables.

My favorite service integrations are those related to optimizing Business Intelligence (BI) insights (such as with Power BI), and the wide selection of Azure service partnerships, such as with Azure Synapse. Azure Synapse extends both compute power and our ability to create dynamic table queries through the use of Spark or SQL select statements. Both existing Dataverse tables and query tables created in Azure Synapse provide creative opportunities for visualizations and insights in Power BI (learn more in this demo). TIP: Take advantage of Spark and SQL select statements along with the enhanced compute power of Azure Synapse.

Conclusion

There is definitely a broad return on investment when it comes to learning Dataverse.

Data Stewardship at Scale – helps us to reuse data and set standards across our business
Granular Security – secures data across the domain and across tables, including column-level and role-based security
Extended Integration Powers – enables us to bring data from anywhere we need it into our solutions

Get started learning today: Microsoft Learn for Dataverse

Additional resources: Security Concepts, Power Fx and Business Rules, Virtual Table Connector Providers, Azure Synapse Link for Dataverse

Extend modern commands with custom pages and geospatial mapping

Buttons and command bars control the core behavior of any application. Without them, we can't print the latest report, start our time-sensitive process, or make our hero storm the castle and save the day in our favorite video game. In Power Apps model-driven apps, they are everywhere, and this blog will show you how to leverage geospatial features with modern commands, custom pages, model-driven apps, and a little bit of JavaScript.

(Image: use commanding to display a custom control)

Walkthrough

In this walkthrough, we will be using custom pages, the map control, and in-app notifications to plot a contact's address on a map.

(Image: map for an individual record)

What if you wanted to plan your trip around town to visit clients more effectively and review the relative locations of your clients? And what if you were asked to do this by your CEO or VP and needed to let them know it was ready to view and where to find this map in your model-driven app? You could build them a beautiful app that looks something like this.

(Image: send notification to CEO or VP with link to the map)

Setup

We need to create a custom page with a map control that will read contact data. What ingredients do we need to build all this? A solution, a model-driven app, contacts, a canvas app, a custom page, and JavaScript.

Prerequisites

As an admin, go to the admin center, turn on geospatial controls, and turn on Bing Maps. Detailed steps are here: Add geospatial controls to canvas apps and Manage Bing Maps for your organizations.

(Image: enable maps)

Let's start with creating a solution, model-driven app, and contacts:

1. From your Power Apps portal, go to Solutions and create a new solution.
2. Create a new model-driven app.
3. Open the model-driven app.
4. Add the contacts table to your model-driven app.
5. Go to contacts and enter new contacts with addresses using the main Contact form.
6. Enter City, State/Province, Country/Region.

For more information, see Create a model-driven app using the account page.

Create canvas app

Why do we need a canvas app? You can copy and paste controls that aren't shown on a custom page from a canvas app. In this case, we want the map control.

1. From the solution, add a blank canvas app. See Create a blank canvas app from scratch.
2. Add a data table that uses contacts as the data source. See Data table control in Power Apps.
3. Add the map control. See Use an interactive map control in Power Apps.
4. Enable the Show current location property and use the formula bar to set the CurrentLocationLongitude and CurrentLocationLatitude, so that when the user selects a row, the map highlights the location with a blue circle.
5. When you're happy with the formatting of the data table and map, select the controls and use CTRL+C on your keyboard. See Copy and Paste controls across Canvas Apps available.

(Image: map advanced current latitude and longitude properties)

Create a custom page

1. From your solution, open the model-driven app.
2. Click the top + Add page button. See Add a custom page to your model-driven app.
3. While on the custom page, use CTRL+V to add the data table and map to your custom page. See Copy and Paste controls across Canvas Apps available.

Open your map from a button

1. From your solution, open the model-driven app and navigate to the command designer. When prompted, leave the default of Main grid. See Open the app designer.
2. When prompted, select to use JavaScript.
3. From the left pane, add a new dropdown using the + New button. You will see a group is added by default.
4. From the left pane, select the group and, using the same + New button, create a command button under the group.
5. Create a local JS file and add the JS script below to open a centered dialog window.
6. Click the + Add library link.
7. Click + New web resource.
8. Upload your file and enter all the fields.
9. Select your new library.
10. Enter the name of the function, in this case openCustomPage.
See Use commands with custom pages with cloud flows. Enter the custom page logical name as the first parameter and the page title as the second. See Use JavaScript for actions and Finding the logical name. For more information, see Create and edit web resources.

Example code for the openCustomPage function:

```javascript
function openCustomPage(customPageLogicalName, customPageTitle) {
  // Opens a centered custom page dialog
  let pageInput = {
    pageType: "custom",
    name: customPageLogicalName,
  };
  let navigationOptions = {
    target: 2,   // open as a dialog
    position: 1, // centered
    width: { value: 50, unit: "%" },
    title: customPageTitle,
  };
  Xrm.Navigation.navigateTo(pageInput, navigationOptions)
    .then(function () {
      // Called when the dialog closes
    })
    .catch(function (error) {
      // Handle error
    });
}
```

Add send notification button

To get this working, we are going to need the system user id. You can use OData to get the system user id quickly with the snippet below. As a test, you can use your own system user id before sending the notification to someone else. See User (SystemUser) table/entity reference.

Example query:

```
/api/data/v9.0/systemusers?$select=fullname&$filter=startswith(fullname,'Alfredo C')
```

1. While still in the command designer, select the group again.
2. Add another command button using the + New button.
3. Add the JS below as a web resource, just like you did above; this time the function name is sendNotification.
4. Enter the system user id of the person who should see the notification as the first parameter.
Enter the page title as the second parameter and the custom page URL as the third. When you are done, save and publish your changes.

Example URL:

```
?appid=0b02a3a4-16da-ec11-bb3b-000d3a33d9bf&ribbondebug=true&pagetype=custom&name=cr1c6_contactslocations_24f66
```

Example code for the sendNotification function:

```javascript
function sendNotification(systemuserid, customPageTitle, customPageUrl) {
  var notificationRecord = {
    "title": "Congratulations",
    "body": "You can review the location of your contacts",
    "ownerid@odata.bind": "/systemusers(" + systemuserid + ")",
    "icontype": 100000001, // success
    "data": JSON.stringify({
      "actions": [{
        "title": customPageTitle,
        "data": { "url": customPageUrl }
      }]
    })
  };
  Xrm.WebApi.createRecord("appnotification", notificationRecord).then(
    function success(result) {
      console.log("notification created with single action: " + result.id);
    },
    function (error) {
      console.log(error.message); // handle error conditions
    }
  );
}
```

🥳 Congratulations! Now you have your map and a way to notify your CEO or VP that it's available. The next time they open the app, they will see the notification and can click the link to review the map.

Note that the JS functions are reusable and can be applied to other tables such as Accounts, Organizations, etc. You can discover your own scenarios; one example is planning a conference with different event locations.

Team Credits

Huge thanks to the commanding engineering team: Alfredo Cerritos LinkedIn, Anshul Jain LinkedIn, Brad Flood LinkedIn, Casey Burke LinkedIn, Prabhat Pandey LinkedIn, Sanket Patadia LinkedIn, Sergio Escalera Costa LinkedIn, Srinivas Dandu LinkedIn. Thanks to Scott Durrow LinkedIn and Adrian Orth LinkedIn for collaboration.
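Both functions in this walkthrough depend on the Xrm client API, which only exists inside a model-driven app at runtime. As a rough sketch you can unit test locally, the OData query string and the appnotification payload can be built by plain JavaScript helpers like the hypothetical ones below (the function names and sample values are ours, not part of the walkthrough):

```javascript
// Hypothetical helpers mirroring the snippets above, written as plain
// JavaScript so the payloads can be tested outside a model-driven app.

// Builds the Dataverse Web API query used to look up a system user id.
// Single quotes inside an OData string literal are escaped by doubling them.
function buildSystemUserQuery(nameStartsWith) {
  const safeName = nameStartsWith.replace(/'/g, "''");
  return "/api/data/v9.0/systemusers?$select=fullname&$filter=" +
    "startswith(fullname,'" + safeName + "')";
}

// Assembles the appnotification record that sendNotification creates.
// The "data" field is a JSON string holding the clickable action link.
function buildNotificationRecord(systemUserId, customPageTitle, customPageUrl) {
  return {
    title: "Congratulations",
    body: "You can review the location of your contacts",
    "ownerid@odata.bind": "/systemusers(" + systemUserId + ")",
    icontype: 100000001, // success icon
    data: JSON.stringify({
      actions: [{ title: customPageTitle, data: { url: customPageUrl } }],
    }),
  };
}
```

Keeping the record construction separate from the Xrm.WebApi.createRecord call makes it easy to check the payload shape before wiring the function into the command button.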