{"id":5354,"date":"2025-06-20T17:46:57","date_gmt":"2025-06-20T17:46:57","guid":{"rendered":"https:\/\/redmonk.com\/jgovernor\/?p=5354"},"modified":"2025-07-18T15:16:01","modified_gmt":"2025-07-18T15:16:01","slug":"microsoft-build-2025-agents-models-github-and-beast-mode-windows","status":"publish","type":"post","link":"https:\/\/redmonk.com\/jgovernor\/microsoft-build-2025-agents-models-github-and-beast-mode-windows\/","title":{"rendered":"Microsoft Build 2025 &#8211; agents, models, GitHub, and beast mode Windows."},"content":{"rendered":"<p><a href=\"http:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-large wp-image-5355\" src=\"http:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-1024x768.jpg\" alt=\"image showing Satya Nadella on stage in front of a slide saying &quot;native support for MCP on Windows&quot;\" width=\"1024\" height=\"768\" srcset=\"https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-1024x768.jpg 1024w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-300x225.jpg 300w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-768x576.jpg 768w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-480x360.jpg 480w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-107x80.jpg 107w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows-836x627.jpg 836w, https:\/\/redmonk.com\/jgovernor\/files\/2025\/06\/native-support-for-MCP-on-Windows.jpg 1179w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/a><\/p>\n<p><span style=\"font-weight: 400;\">Microsoft did an excellent job at its Build conference last month, showcasing its strengths and ambitions for the AI era.<\/span><\/p>\n<p><span style=\"font-weight: 
400;\">It\u2019s always hard to summarise Build, because the company has such a long history and a number of different developer and ops centers of gravity &#8211; including Microsoft Azure (cloud deployment and management), Developer Division &#8211; aka DevDiv &#8211; and GitHub (developer tools and platforms), Power Platform (low-code and AI-powered business applications), Windows (desktop and server infrastructure). From year to year, one division or another will get more attention than others, depending on the vagaries of Big Launches. Then of course there are the cross-cutting concerns &#8211; issues which impact the company across its entire portfolio &#8211; open source and AI are good examples.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Everyone gets their moment on the keynote stage after Satya Nadella kicks things off on day one, which can make things feel hurried, and storytelling suffers. What Microsoft wanted us to come away with this year was the emergence of what it is calling The Agentic Web, the idea that AI-based agents are going to fundamentally remake how we use technology to conduct business.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">While a lot of tech companies are currently proposing AI agent platforms as thinly disguised (sometimes not disguised at all) replacements for humans, Microsoft is keen to thread the needle of agents as platforms to enable humans to get their work done more effectively. Recent and future layoffs may not help that message, but it\u2019s the message nonetheless.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">After reflecting on the event (for quite a while, apparently), I am ready to tell some stories from Build as I see them.\u00a0<\/span><\/p>\n<p><b>The GitHub Embrace<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Let\u2019s start with DevDiv and GitHub &#8211; a key takeaway was that this was the most integrated Build we\u2019ve seen yet from Microsoft and GitHub. 
They set the tone from minute one of the conference, introducing Seth Juarez, Microsoft principal program manager, and Kadesha Kerr of GitHub as co-hosts to present the winners of this year\u2019s Imagine Cup, before handing over to Microsoft CEO Satya Nadella for his keynote. Presentations and demos interwove GitHub seamlessly into the keynote content and the conference content more generally. So the framing was pretty clear &#8211; if GitHub and Microsoft are closer than ever at Build, you can rest assured that they\u2019re closer than ever the rest of the time too.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Which is to say &#8211; one important implication of the AI revolution is that the arm\u2019s length relationship between Microsoft and GitHub is going to be\u2026 less arm\u2019s length. The integration isn\u2019t just going to be at Build keynotes. As Microsoft gears up for a more competitive, winner-take-all, more cut-throat era, GitHub is a key asset, and is going to be treated as such.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Microsoft has kept GitHub at relative arm\u2019s length since acquiring the company in 2018, which has made a lot of sense (up until now). It\u2019s effectively an independent subsidiary. Microsoft has given GitHub time to grow up, with its own distinct culture, approach to UX and developer experience largely intact, while at the same time lighting a fire under the company from a product delivery perspective, with an infusion of its own people (see Nat Friedman and Thomas Dohmke, both Microsofties, but importantly company founders in their own right &#8211; see Xamarin and HockeyApp).\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">GitHub ships now. Perhaps not at the relentless pace of some startups, but fast enough to get ahead of the market, and respond to threats. The Microsoft-GitHub combination came into its own by leading the current AI wave crashing onto the shores of the industry. 
It\u2019s worth remembering that GitHub Copilot launched in October 2021. OpenAI launched ChatGPT just over a year later in late November 2022. And now in 2025, with the emergence of AI-native editors like Cursor and Windsurf and a slew of agent platforms, <\/span><a href=\"https:\/\/redmonk.com\/jgovernor\/2025\/02\/21\/ai-disruption-code-editors-are-up-for-grabs\/\"><span style=\"font-weight: 400;\">everything is up for grabs<\/span><\/a><span style=\"font-weight: 400;\">.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">There are some who would argue that Microsoft and GitHub didn\u2019t respond quickly enough to the emerging threats, but I would argue that\u2019s overblown. Things are moving extremely quickly in the industry right now. Responding to threats doesn\u2019t always mean heading them off entirely.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Peacetime assumptions are behind us &#8211; competition is fierce, decisions need to be made quickly, and new products and features shipped remorselessly. Let\u2019s just say GitHub is only too aware of this reality. As is the mothership.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Just as accessing LLMs with chat to generate code has been supplanted by integrating LLMs directly into editors, there is another center of gravity which arguably makes even more sense for agents &#8211; that center of gravity being the GitHub workflow. Agents are asynchronous &#8211; developers can\u2019t be sitting around waiting for them to finish a job. The kind of long-running reasoning that agents provide is really anathema to developer flow, if you\u2019re in an editor or chat interface. But GitHub was built for the kind of asynchronous workflow we\u2019re seeing emerge. 
Agents and pull requests go together like peanut butter and chocolate.\u00a0<\/span><\/p>\n<p><a href=\"https:\/\/github.com\/newsroom\/press-releases\/coding-agent-for-github-copilot\"><span style=\"font-weight: 400;\">GitHub Introduces Coding Agent For GitHub Copilot<\/span><\/a><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">The agent starts its work when you assign a GitHub issue to Copilot or ask it to start working from Copilot Chat in VS Code. As the agent works, it pushes commits to a draft pull request, and you can track it every step of the way through the agent session logs. Developers can give feedback and ask the agent to iterate through pull request reviews. <\/span><\/p>\n<p>The agent is expressly designed to preserve your existing security posture, with additional built-in features like branch protections and controlled internet access to ensure safe and policy-compliant development workflows. Plus, the agent\u2019s pull requests require human approval before any CI\/CD workflows are run, creating an extra protection control for the build and deployment environment.<\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">This human approval is essential. Vibe coding is all very well and good, but someone has to take responsibility for checking in the code. And indeed take the credit for doing so. Developers are the human in the loop, which is just as it should be. Software engineering in general, and enterprise software engineering in particular, is really not amenable to You Only Live Once. <\/span><span style=\"font-weight: 400;\">So GitHub and Microsoft are positioning the engineer and engineering team as the point of control and quality, as well as curation and overall decision-making.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">From an industry context perspective it\u2019s worth mentioning Jules, &#8220;an asynchronous coding agent&#8221;, which Google launched the same week. 
It\u2019s another autonomous agent platform, designed to fit into the GitHub workflow. As Google <\/span><a href=\"https:\/\/blog.google\/technology\/google-labs\/jules\/\"><span style=\"font-weight: 400;\">stressed<\/span><\/a><span style=\"font-weight: 400;\">:<\/span><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">GitHub integration: Jules works where you already do, directly inside your GitHub workflow. No context-switching, no extra setup.<\/span><\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">So deep GitHub integration is the new hotness for coding agents. Go where developers are, and enable asynchronous work.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In a move that was both offensive and defensive, GitHub also announced that it is open sourcing Copilot Chat in VS Code, shoring up Code\u2019s position as the modern developer\u2019s editor of choice. As Cursor and Windsurf double down on integration and tightly packaged experiences, contributing the VS Code GitHub Copilot Chat extension makes it more appealing from an ecosystem perspective. Open source is still a very useful lever to pull in 2025 &#8211; using the permissive MIT license emphasizes the ecosystem. Microsoft and GitHub also made a commitment to integrate and open source further AI capabilities into VS Code core.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Talking of ecosystems, Coding Agent will also be rolled out for JetBrains, Xcode and Eclipse.<\/span><\/p>\n<p><b>Beast mode Windows\u00a0<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Let\u2019s use open source as the transition to the next section &#8211; where we examine Windows &#8211; specifically to see if Microsoft can make the platform more appealing to developers. Apple has owned the modern web developer for the last 15 years or so. 
Developers may kvetch about Apple Xcode, but they love the shiny, highly performant hardware and OS user experience, and MacBooks continue to dominate.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">So Microsoft needs to compete on the basis of software, hardware (performance matters!), dev tool integration, and flexibility. That\u2019s a lot.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A long-standing industry joke is that this is (finally) the year of Linux on the desktop. Ironically enough, Microsoft has been doing its best to build Linux support into the OS with the Windows Subsystem for Linux, allowing you to run Linux on your Windows machine without needing a virtual machine or dual-boot setup. The WSL experience has been steadily improving since its introduction in 2016. Nine years later Microsoft has finally responded to the very<\/span><a href=\"https:\/\/github.com\/microsoft\/WSL\/issues\/1\"><span style=\"font-weight: 400;\"> first issue on the WSL repo<\/span><\/a><span style=\"font-weight: 400;\"> &#8211; to open source the code. WSL, which supports most popular distros including Arch Linux, doesn\u2019t suddenly become every Linux user\u2019s tool of choice &#8211; but the target isn\u2019t actually Linux, it\u2019s macOS. The open source decision just removes one potential objection.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">If Microsoft really wants to turn the dial it needs to improve WSL performance and the out of the box experience for drivers and third party apps. The faster, snappier and less hassle it is, the more developers are likely to adopt it. Lag is one of the common complaints about WSL. But the open source move certainly won\u2019t hurt. That said, Apple isn\u2019t standing still &#8211; at its WWDC event it just announced a new Swift-based container runtime framework &#8211; enabling developers to create and run Linux container images directly on their Macs. 
It\u2019s VM-based, but Apple\u2019s performance is such that this is unlikely to be an issue. For a version one feature it looks pretty compelling.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Talking of performance, Microsoft is clearly a long way behind Apple when it comes to silicon &#8211; Apple is ahead on both raw performance and energy consumption. Indeed, unlike Apple, Microsoft has to rely on traditional OEM partners for its microprocessors. In order to become more competitive it had to push ahead with support for ARM-based chips. It reached an important milestone in that respect last year with the launch of the Copilot+ form factor, and Microsoft\u2019s first Surface laptops based on Qualcomm Snapdragon X chips.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">The Copilot+ series was positioned as an AI-first architecture, including neural processing units (NPUs) dedicated to AI and machine learning tasks. But optimising for NPUs was always going to be problematic in terms of getting developers excited about what they could do with a Windows machine. At launch Qualcomm got all of the attention because it was the basis of Microsoft\u2019s own machines. Yay to ARM laptops! But in the land of LLMs, GPUs are king. Requiring NPUs is all well and good &#8211; but if you don\u2019t do a great job of supporting Nvidia you\u2019re really not in the game.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Which brings us back to Build 2025 and one of the more interesting announcements &#8211; which played directly to the idea of developer choice. Microsoft announced <\/span><a href=\"https:\/\/learn.microsoft.com\/en-us\/windows\/ai\/overview\"><span style=\"font-weight: 400;\">Windows AI Foundry,<\/span><\/a><span style=\"font-weight: 400;\"> which, through WindowsML, supports inferencing regardless of microprocessor &#8211; AMD, Intel, Nvidia and Qualcomm, whether CPU, GPU or NPU. 
The platform also offers catalogs like Ollama and the Nvidia NIM microservices packaging, as well as LoRA fine-tuning for Microsoft\u2019s Phi Silica small language model. This could be a big deal. Just as with driver support in Windows back in the day, out of the box model and framework support across all hardware architectures is a classic Microsoft play.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Microsoft also announced the private developer preview of a framework for managing Model Context Protocol (MCP) access to Windows applications. It makes perfect sense to adopt MCP &#8211; it has overnight become a de facto industry standard &#8211; but it also pays to be cautious. MCP lacks a strong security model, and Microsoft needs to be very careful about potentially opening Windows to LLM-based security exploits.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">One missed opportunity here was to show a beast mode gamer rig also used for running AI workloads. A lot of gamers choose Windows because of Xbox and hardware flexibility &#8211; there has to be a subset of folks that would love to upgrade their machine to use for both gaming and LLMs. A single machine for vibe gaming and vibe coding. That remains an open marketing opportunity.\u00a0<\/span><\/p>\n<p><b>Low Code is finally a thing<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Not everything is about hardcore developers though &#8211; the Microsoft Power Platform also got its moment in the sun. Charles Lamanna, CVP of Business and Copilots at Microsoft, always makes a compelling case, but the infusion of AI agents into the platform with Copilot Studio has the potential to finally deliver on the promises of low code. As a long-term low-code skeptic I am increasingly coming round to the idea that AI is the underlying technology that will make a ton of new use cases possible for a broad range of users. 
Salesforce is in a similar position with its Agentforce platform &#8211; Salesforce admins and developers are going to build a ton of cool extensions for Salesforce customers. Salesforce has a deeply engaged global community, folks with business domain experience, and they\u2019re ready to be unleashed with Agentforce if the user experience is right. The Salesforce platform is increasingly horizontal, no longer all about CRM, and Agentforce is a marker for that.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">While some in the industry argue that AI will remove the need for packaged applications &#8211; just have a solid data foundation and a bunch of agents instead! &#8211; for companies with solid enterprise application customer bases, and a low-code\/AI play, there is going to be plenty of opportunity for upside. And if there\u2019s a question about whether AI will replace low code, Power Platform provides an answer, and the answer is no &#8211; the centers of gravity will come together and package agents up for business users.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">So yes, I think we can expect significant further growth from Power Platform &#8211; in a business that already has a lot of momentum picking up new enterprise logos. Microsoft did a solid job of showcasing Copilot Studio. We\u2019re obviously not going to see everyone learning to code just because code assistants are out there; the drag-and-drop application development experience, augmented with natural language commands and prompts, makes a great deal of sense in the AI agent era.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">So that\u2019s Microsoft\u2019s low-code business, but what about infrastructure?<\/span><\/p>\n<p><b>A quiet year for Azure<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Azure\u2026 was not the belle of the ball at Build this year. 
Cloud infrastructure was somewhat relegated at Build 2025, which is not to say there were no announcements, but mostly Azure was simply the place to run all the apps built with the new tooling.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">One bit of news that did catch my eye was <\/span><a href=\"https:\/\/learn.microsoft.com\/en-us\/azure\/ai-foundry\/what-is-azure-ai-foundry\"><span style=\"font-weight: 400;\">Azure AI Foundry<\/span><\/a><span style=\"font-weight: 400;\">. Model management and guardrails will be a core part of any infrastructure cloud, but what really struck me was that Microsoft is effectively describing what I call Progressive Delivery as a core AI pattern (our book on the subject will be <\/span><a href=\"https:\/\/itrevolution.com\/product\/progressive-delivery\/\"><span style=\"font-weight: 400;\">published in November<\/span><\/a><span style=\"font-weight: 400;\">).<\/span><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">Azure AI Foundry will offer a new Model Router, in preview, which will automatically select the best OpenAI model for prompts, leading to higher quality and lower cost outputs. Additionally, automated evaluation, A\/B experimentation and tracing in Foundry Observability will support rollback to proven models if new ones underperform, enabling developers to stay on the cutting-edge of model capabilities to deliver cost-effective solutions.<\/span><\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">Stay on the cutting edge, but manage the blast radius.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Scott Guthrie, executive vice president of the Microsoft Cloud + AI Group, in his keynote slot on day two positioned CosmosDB as an underlying service that has helped propel OpenAI forward. 
OpenAI is taking advantage of Microsoft managed data services, not just infrastructure as a service.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">He also talked up Microsoft\u2019s green credentials, one of the few times sustainability was mentioned at the show. It was good to see (at least some) lip service paid to sustainability there, when other major vendors seem to be jettisoning their commitments.<\/span><\/p>\n<p><b>Multi-model is not a position of strength<\/b><\/p>\n<p><span style=\"font-weight: 400;\">Finally &#8211; some thoughts on Large Language Models and industry ecosystems. Here is where the news is not quite so good for Microsoft. Betting on \u201cmulti-model\u201d is not the winning position.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">At Build this reality was brought into stark relief in the \u201cCEOs of competitors\u201d section of Nadella\u2019s keynote. Sam Altman of course appeared in a Zoom interview, just a couple of weeks after rumours emerged that OpenAI might be acquiring Windsurf. Whether or not OpenAI does so, with its coming push into consumer devices, led by Jony Ive, it\u2019s going to be competing directly with both Apple and Microsoft.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">After Sam Altman we had a recorded appearance from Elon Musk, because Microsoft was announcing support for xAI\u2019s Grok model. Whatever you think of Musk, and clearly he has his boosters, he\u2019s a deeply divisive figure, and there were definitely people in the room and online who were not happy to see him featured. Nadella\u2019s values and those of Musk would not seem to be terribly well aligned.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">If Microsoft is forced to talk up the benefits of multi-model and provide platforms for third parties building frontier models, that\u2019s arguably a problem for the company. 
Essentially all of the developer-facing tools described above are running either OpenAI or Anthropic models.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Compare and contrast with Google, which is now puffing out its chest at its own events, trumpeting the capabilities of Gemini. Sure, it supports multi-model in Google Cloud, but it can happily lead with Gemini in demos and its own code assist products and agents.\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Microsoft on the other hand is not in charge of its own destiny when it comes to frontier models, which is ironic for the company that is now marketing the idea of Frontier Firms. <\/span><a href=\"https:\/\/www.microsoft.com\/en-us\/worklab\/work-trend-index\/2025-the-year-the-frontier-firm-is-born\"><span style=\"font-weight: 400;\">According to Microsoft research<\/span><\/a><span style=\"font-weight: 400;\">:<\/span><\/p>\n<blockquote><p><span style=\"font-weight: 400;\">We are entering a new reality\u2014one in which AI can reason and solve problems in remarkable ways. This intelligence on tap will rewrite the rules of business and transform knowledge work as we know it. Organizations today must navigate the challenge of preparing for an AI-enhanced future, where AI agents will gain increasing levels of capability over time that humans will need to harness as they redesign their business. Human ambition, creativity, and ingenuity will continue to create new economic value and opportunity as we redefine work and workflows.<\/span><\/p>\n<p>As a result, a new organizational blueprint is emerging, one that blends machine intelligence with human judgment, building systems that are AI-operated but human-led. 
Like the Industrial Revolution and the internet era, this transformation will take decades to reach its full promise and involve broad technological, societal, and economic change.<\/p><\/blockquote>\n<p><span style=\"font-weight: 400;\">Well, in that case it\u2019s probably a good idea to own one of the most powerful AI and LLM companies. Models may be commoditising, but that doesn\u2019t mean you don\u2019t want to own one. If that game moves forward to winning customers with tokens, then you don\u2019t want to be relying on a third party. And it\u2019s increasingly a token-based world.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In closing &#8211; Microsoft is in excellent shape, with some great assets, but in my view it either needs to embark on a moonshot to build or foreground its own models, or it should make an era-defining acquisition of Anthropic, which offers the best experience for code. Anthropic would be expensive &#8211; certainly north of the $61.5bn valuation of its last funding round &#8211; but Wall Street trusts Nadella, and the stakes are absurdly high. Rumours currently abound that OpenAI is trying to change the terms of its contract with Microsoft, and that OpenAI is under-cutting Microsoft in Copilot sales deals. I don\u2019t really see how this will end well. This last paragraph deserves a post in its own right, so that\u2019s what you\u2019re going to get. I will be following up shortly.\u00a0\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Disclosure: GitHub, Microsoft, Google Cloud, and Salesforce are all clients.\u00a0<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Microsoft did an excellent job at its Build conference last month, showcasing its strengths and ambitions for the AI era. 
It\u2019s always hard to summarise Build, because the company has such a long history and a number of different developer and ops centers of gravity &#8211; including Microsoft Azure (cloud deployment and management), Developer Division<\/p>\n","protected":false},"author":5,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"spay_email":"","footnotes":"","jetpack_publicize_message":"","jetpack_is_tweetstorm":false},"categories":[1],"tags":[],"class_list":["post-5354","post","type-post","status-publish","format-standard","hentry","category-uncategorized"],"jetpack_featured_media_url":"","jetpack_publicize_connections":[],"jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p9wfjh-1om","_links":{"self":[{"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/posts\/5354","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/comments?post=5354"}],"version-history":[{"count":0,"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/posts\/5354\/revisions"}],"wp:attachment":[{"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/media?parent=5354"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/categories?post=5354"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/redmonk.com\/jgovernor\/wp-json\/wp\/v2\/tags?post=5354"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}