Posted almost 4 years ago by Erik van Ballegoij
Creating a good module can make it very hard to improve upon later. The (still current) old release of the DotNetNuke Announcements module was such a module. Created in an era long forgotten, as a DotNetNuke 4 module, it was quite good at the time it was released. The Announcements module is a fairly simple module that is a useful tool on most websites, while at the same time showcasing important DotNetNuke API features, and thus also playing a role in helping developers understand DotNetNuke module development. I feel pretty bad when I look at the last release date of the module though: May 15, 2009. More than three and a half years ago. The module might have been a good module back then, but it certainly is not anymore. What was regarded as best practice back then is currently not much more than legacy code. So… it is time to fix this with a new and shiny release, ready for the DotNetNuke 7 era that is upon us! Let's review the plans for this new version of the module.

DotNetNuke UI

The old module doesn't look very nice anymore in current DotNetNuke versions. Of course, in this reincarnation the DotNetNuke form pattern, as explained in the UX Guide, is implemented (the original post shows a screen shot of the Edit Announcement popup).

Rewrite to C#

Some people in the DotNetNuke community used to think that the VB in my nickname ErikVB was short for Visual Basic. It is of course nothing more than a coincidence that VB happens to be the first letters of my last name. The switch to C# may come as a surprise to some, but really it doesn't matter that much. As it currently stands, there seems to be a movement in the direction of using C#. In my mind, one of the raisons d'être for this module is to be a good sample module for starting DotNetNuke developers. As for the process of converting, I used Instant C# from Tangible Software.
This automated conversion tool works pretty well; the only area of the project it had trouble with was the DAL (specifically the SqlDataProvider class). Of course that was not a huge problem, since the old DAL code was slated to be removed in the first place.

Implement DAL 2

One of the very nice new features is the introduction of the DAL 2. In a series of blog posts, starting with this one, Charles Nurse, Chief Architect of DotNetNuke Corp., outlines how the face of data access is changing with DotNetNuke 7. We are implementing a micro-ORM called PetaPoco, which provides us with a number of advantages. The bottom line is that it has become a lot less work for developers to access the DotNetNuke database. No more writing of stored procedures (well, to be fair, you can still do that if you need to), which apart from being faster is also good for making fewer mistakes. In my experience, a lot of errors in modules actually originate in missing ObjectQualifiers in the SQL scripts. The new DAL 2 automatically factors in the ObjectQualifier of the installation, so that leaves one less thing for the developer to think about. In the new Announcements module, the AnnouncementInfo class is now a plain object decorated with DAL 2 attributes, and the AddAnnouncement method in the AnnouncementsController class has shrunk to a few lines. I must say, this code is much cleaner than under DAL 1, and on top of that, as a bonus, we now get top notch caching of our business objects, just by marking them as Cacheable.

Implement DotNetNuke WebAPI

Of course another very cool new feature of DotNetNuke 7 is the support for ASP.NET Web API. Many things have been written about it, and it is good to see that a lot of developers in the community are jumping on this new feature. The new Announcements module will use the new WebAPI for a couple of different things: templating (think KnockoutJS), being able to generate an XML view on the module content, RSS/ATOM support, etc.
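The actual class and controller code are shown as screenshots in the original post. As a hedged sketch of what DAL 2 code of this shape looks like (the property names, column names, and cache settings here are my assumptions, not the module's real source), the pattern is roughly:

```csharp
using System;
using System.Web.Caching;
using DotNetNuke.ComponentModel.DataAnnotations;
using DotNetNuke.Data;

// Sketch only: the attributes map the class to a table, declare the key,
// scope queries by ModuleID, and enable automatic caching of the objects.
[TableName("Announcements")]
[PrimaryKey("ItemID")]
[Scope("ModuleID")]
[Cacheable("Announcements", CacheItemPriority.Normal, 20)]
public class AnnouncementInfo
{
    public int ItemID { get; set; }
    public int ModuleID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTime PublishDate { get; set; }
}

public class AnnouncementsController
{
    public void AddAnnouncement(AnnouncementInfo announcement)
    {
        // DAL 2 maps the object to the table for us; no stored procedure
        // is needed, and the ObjectQualifier is handled automatically.
        using (IDataContext ctx = DataContext.Instance())
        {
            var repository = ctx.GetRepository<AnnouncementInfo>();
            repository.Insert(announcement);
        }
    }
}
```

The repository pattern above (DataContext.Instance plus GetRepository) is the general DAL 2 shape Charles Nurse describes in his blog series; the module's real code may differ in detail.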
Of course WebAPI supports automatic output formatting based on what the client requests. By default this means that the output is either XML or JSON; however, I was particularly interested in the option to extend the output formats with your own. My idea is to offer an alternative RSS feed for the module that allows me to do something special: offer a combined feed of all the modules in a DotNetNuke site, filtered to the ones that you are actually allowed to see. I came across an interesting blog by Filip Woj about creating an RSS/Atom MediaTypeFormatter for WebAPI. Implementing this is very simple, and the beauty of it is that with the same code we can now serve both RSS and Atom feeds. The first step in adding support for WebAPI is adding at least one route. In my case, I wanted the route to include an option to force the output into a certain format, which led to the route definition and the GetCurrentAnnouncements implementation shown in the original post. The ActionName is defined as "Current", which means that, according to the defined route, the URL for this action is /DesktopModules/Announcements/API/Current, or optionally, for instance, /DesktopModules/Announcements/API/Current/rss. The default value for the Output parameter is "default", which means that in that case the auto formatting feature of WebAPI is used. All methods that return results based on a list of AnnouncementInfo objects use a single GenerateOutput method to send the output to the client, which means that all these methods share in the goodness.

The future

All of this is currently available in a CTP release that was released today. As much as I had wanted it to be, the module is far from ready. The work that has been done so far is, apart from the UX upgrade, mainly in the invisible parts of the module. I feel this is the right direction to go, as the module really needed a new, strong foundation.
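Since the route and action code are only shown as images in the original post, here is a hedged sketch of what a DNN 7 service route with an optional output segment might look like. The route URL, namespace, and the GetAnnouncements and GenerateOutput helpers are my assumptions for illustration, not the module's actual source:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web.Http;
using DotNetNuke.Web.Api;

// Sketch only: DNN 7 discovers route mappers via IServiceRouteMapper.
public class AnnouncementsRouteMapper : IServiceRouteMapper
{
    public void RegisterRoutes(IMapRoute mapRouteManager)
    {
        // The optional {output} segment lets a caller force rss/atom output;
        // "default" falls back to WebAPI's content negotiation.
        mapRouteManager.MapHttpRoute(
            "Announcements",
            "default",
            "{controller}/{action}/{output}",
            new { output = "default" },
            new[] { "DotNetNuke.Modules.Announcements.Services" });
    }
}

public class AnnouncementsController : DnnApiController
{
    [HttpGet]
    [ActionName("Current")]
    public HttpResponseMessage GetCurrentAnnouncements(string output)
    {
        IEnumerable<AnnouncementInfo> items = GetAnnouncements(ActiveModule.ModuleID);
        return GenerateOutput(items, output);
    }

    private IEnumerable<AnnouncementInfo> GetAnnouncements(int moduleId)
    {
        // Hypothetical data-access helper standing in for the business layer.
        return Enumerable.Empty<AnnouncementInfo>();
    }

    private HttpResponseMessage GenerateOutput(IEnumerable<AnnouncementInfo> items, string output)
    {
        // For "default", let WebAPI negotiate JSON/XML; a real implementation
        // would pick an RSS/Atom MediaTypeFormatter when output requests it.
        return Request.CreateResponse(HttpStatusCode.OK, items);
    }
}
```

Because every list-returning action funnels through one GenerateOutput method, adding a new formatter (such as the RSS/Atom MediaTypeFormatter from Filip Woj's post) benefits all of them at once.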
The next major enhancement will be the templating system. As it stands now, I am strongly considering adding support for multiple templating systems: DotNetNuke TokenReplace, KnockoutJS, Razor and XSL. Unlike the current system, the new templating will be file based, which I would guess will make some skinners out there a little bit happier. Other than that, I would say the module also needs support for the DotNetNuke Taxonomy system, and also for the Social API… but more about that later. For now, take a look at the CTP for the new Announcements module for DotNetNuke 7, available here: http://dnnannouncements.codeplex.com/releases/view/98256. Enjoy, but please do not use it in a production site yet! If you find any issues, please log them in the issue tracker on CodePlex.

(This is a repost from my personal blog)
Posted almost 4 years ago by Shaun Walker
For the third year in a row, DotNetNuke Corporation has achieved the distinction of being recognized by Business in Vancouver (BIV) as one of the Top 100 Fastest-Growing Companies in B.C. for 2012! This year we improved our ranking by 3 positions to come in at #14 overall, with a 4 year revenue growth rate of 892%! This distinction places us in the Top 5 fastest growing technology firms in the province. We congratulate all of the other businesses who made the list for 2012.
Posted almost 4 years ago by Shaun Walker
This is the first year that CMS Critic has conducted their own Critic's Choice Awards for Content Management Systems. We are proud to announce that based on popular vote, DotNetNuke has received a People's Choice Award for 2012, taking first place in one of the Best CMS categories. DotNetNuke edged out Liferay for the win, and we really appreciate the support and enthusiasm demonstrated by the DotNetNuke community, as it really highlighted the strength and passion of our ecosystem. The Critic's Choice Awards will be announced on December 1st, and we are hopeful that excitement and anticipation for the upcoming DotNetNuke 7.0 release will factor into the judges' decision.
Posted almost 4 years ago by Shaun Walker
The results of Visual Studio Magazine's 20th annual Readers Choice Awards were recently published, and we are proud to announce that DotNetNuke has achieved Silver recognition in the Web Design and Development Tools category in 2012. This achievement was based on feedback from Visual Studio Magazine readers, who were invited to vote for their top picks from a list of more than 500 eligible development tools in 28 categories, all designed to increase developer productivity or bridge gaps in the Microsoft stack. For the second year in a row DotNetNuke took second spot in the category, just behind Telerik's Kendo UI client-side web control suite. Adobe's Creative Suite Master Collection fell from first place in 2011 to third place in 2012. It is great to be recognized as one of the most valuable tools for Microsoft developers, especially amongst such esteemed company. We appreciate your support.
Posted almost 4 years ago by Peter Donker
In this blog post I intend to explore the potential of DNN Social. This is an umbrella term for the bits and pieces in the framework that allow users to interact with each other in ways reminiscent of Facebook. It includes modules/APIs called "messaging", "journal" and "groups", among others. Most of DNN Social can be traced back to Active Social, a module developed by a company called Active Modules, which was acquired by DNN Corp in early 2011. Much has been made of DotNetNuke's new "social" capabilities since the first release of these in version 6.2. And quite rightly so. The deep integration of what used to be a set of third party components into the framework has now given us a solution with a great deal of potential. DNN Corp, of course, are the makers of the platform and make their money selling a professional version to bigger, more demanding clients. Now, the acquisition of Active Modules and the subsequent embedding of their "Active Social" product into DNN led to some head scratching in the community, as Active Social could be perceived as a typical "community" component (and happily it was rolled out in the Community Edition of the platform). How does that fit with "business"? In the keynote address of Navin Nagiah at DNN World 2012, I got a taste for how this is being positioned. Announcing "Nebula", the more social bits for Q1 2013, the message was "this will help you break down barriers between your organization and your customers". And that's a good point. Who wouldn't want to do that? I'm pretty sure the solution will fit nicely into the C2C and B2C communication niche. But it struck me that this is not what I'd have put at the forefront. Partly because I felt it addressed a limited segment of the product making industry (would Exxon look for interactions with customers on how their fuel is used?). But more importantly because I felt that somehow an even bigger goal was left unmentioned.
The focus on B2C communication is certainly one way of looking at the value of DNN Social. Understand your customers and learn how to increase the value you can deliver to them. But I think there is more to be gained from DNN Social for business. Much more. And with real value. But allow me to digress into the nebulous realm of "knowledge management" before I get there. It is a field I got familiar with in the early 2000s, when I used to work at the Telematica Instituut (now Novay) in Enschede, Holland. This is a research institute founded by the Dutch government with the aim of spurring ICT research in Dutch business. The institute develops and leads projects that combine private enterprise with universities to stimulate knowledge sharing and spur innovation. Specifically, I have been involved in the fields of Computer Supported Collaborative Work (CSCW) and Knowledge Management (KM).

Knowledge Management

The field of knowledge management came to the fore in the early nineties. In a seminal book, Japanese authors Nonaka and Takeuchi illustrated how, in their view, knowledge was fostered and transferred in Japanese industry. They based their ideas on the concept of "tacit" knowledge from Hungarian philosopher Michael Polanyi, who claimed that some knowledge doesn't lend itself to being verbalized, and that all our explicit (i.e. easily expressed) knowledge is grounded in this tacit knowledge. Nonaka and Takeuchi used a quadrant to show how knowledge is made explicit and turned implicit again in an ever ongoing spiral of knowledge creation; no publication on knowledge management is complete without that illustration, and it underlies most thinking in this field. Nonaka and Takeuchi were addressing a business audience at the time. The central tenet of their publication: knowledge is as important a production factor as capital and brute labour (i.e. muscle) and should be managed as such. Well, this led to a whole cottage industry around this theme.
Interestingly enough (and this is what has always fascinated me in this field), this seemed to play out on the edge of technology and cognitive science. And the two sides don't always play nice together. In my time in this field the "hard" technologists would accuse the "soft" social scientists of waffling while not bringing concrete (read: measurable) results, and the softies would argue that the nerds were missing the point entirely. Anyway… that's behind me now. Suffice to say that "just throwing technology" at a problem is generally not the way to get results, but equally that technology has a significant role to play in improving the sharing of knowledge. We are now well over 20 years into this field and I'm sure Nonaka and Takeuchi (or any of us for that matter) could not have dreamt of how the world has evolved since that time. What has been dubbed "Web 2.0" has given researchers in knowledge management a very exciting decade. From blogging to microblogging, from Twitter to Facebook, from web to mobile. We have had a very exciting ride. And the doomsayers who believed we'd interact less with each other socially as the web began to consume our lives have been firmly disproved. We've augmented our life with social media, not replaced it. And researchers have been having a field day with the data to come out of this. Back when I was working in this field, for instance, fellow researchers were looking at how blogging was transforming the way we work. Obviously, auto-publishing was having an impact not only in our private lives, but equally in corporate knowledge management. And it sent shivers down the spines of top management, who now feared all the company's secrets would be blogged, tweeted, etc. Which brings me to the point of secrecy.
Although an organization has a vested interest in having employees interacting and sharing knowledge, it has a natural tendency to be wary of any technology that may facilitate this, for fear of security breaches or other liabilities. There is a trade-off between secrecy and the free flow of knowledge. And companies that take a hard line on information security (e.g. sharing on a "need-to-know" basis) tend to suffer in terms of knowledge sharing. It's not rocket science. But let's not be dogmatic about this. Neither must completely trump the other. There must be some attainable balance in this. And this is where I think the new DNN has a role to play.

Peripheral awareness

Back to technology (after all, we are here in the business of creating a kick-ass web platform). Microsoft and others have recognized the "social" trend, but as so often they're fashionably late to the party. Recently, they've bought Skype and they're hammering away at the next version of their "Lync" corporate communication platform. But despite all the new toys, people will probably remain glued to Blogger, Twitter and Facebook. And not just for personal stuff. You don't really think that we all just tweet about what we're eating today, right? A lot of us publicise what we're working on on a daily basis. And we walk a fine line between our desire to share and our boss' desire for us to "shut up and get on with it". I often see posts along the lines of "working on an exciting new site for a Fortune 500 company". Intentionally vague, but giving me some "peripheral awareness" of what this person is up to. Peripheral awareness is knowing what your colleagues/friends/etc. are doing without specifically paying attention. It is like knowing that it's raining without having stared outside and thought: "hey, it's raining". Similarly, we consume a huge number of tweets each day which we don't internalize. Instead we filter them to look for what we find interesting.
But we may have glanced over that tweet of that friend about how awesome Rome is. So he's in Rome. On holiday? Business? Whatever. We move on. But when at night the news has a segment on riots in Rome in reaction to the government's new austerity measures, we remember our friend is there. Is this important? Maybe. If the friend is a colleague we may realize she is out of the office. Or that he is visiting some conference he tweeted about a month earlier. The fact is we store a lot of stuff and it gets combined at the most creative moments. Social media (microblogging, Foursquare, etc.) augment our peripheral awareness. Like a head-up display in our head, it adds to our understanding of our world. Let's look at an example. One of my most treasured clients is an architectural firm: an internationally acclaimed office employing between 50 and 100 people. Big enough for an intranet to have some real benefits. Architecture is a very interesting business. It combines a multitude of disciplines that work toward a narrowly defined, common goal. In larger firms like this, you'll typically see specialists working alongside generalists to make this happen. Within the walls of the office you can have specialists such as structural engineers, interior designers, etc. But also eco-friendly building experts and other, more fringe disciplines. Here I wish to highlight the example of a guy who knows everything about materials. He knows that if you want a roof tile with specifications x, you'd better have a look at this or that company's product line. And he keeps himself informed by keeping in tune with what's coming out. These people are invaluable in the organization. Once settled into their niche, a certain pride and geekiness kicks in. They become the go-to person. One of his wishes was not only to be able to record his discoveries, but also to be able to share what he knew for the whole organization to see. He now uses a blog to write about new products coming out.
What I want to illustrate is that the blog, although not every member of the office may read everything he has to say, is part of the peripheral awareness of the whole office. People know who to ask, and may even be inspired by one of his new discoveries and use it in their project. The blog is the oil in the machine. It spurs creativity and innovation. But we could do even better, in my opinion, by providing a more interactive layer over this. And this is what I think DNN is now bringing forward with DNN Social. The ability to form a group could be used to create a group around materials. The journal of this group would contain all interactions within this group as well as activity around the blog. The comments of the blog can now hook into the general journal mechanism, and others will have a peripheral awareness of activity in the blog. And who knows what exciting new module comes next that leverages this API.

Communities of practice

Another important aspect of social media is that it gives us a sense of belonging. The interactions build a sense of community. This brings me to the next important aspect of the social bits: grouping. As sponges we (try to) absorb what our devices (laptops, tablets, phones) pump toward us. In doing so, the source of the information items is significant. Is it family? Work? One of my running buddies? Some crazed lunatic who'll say anything to get attention? And we group these sources. It allows us to more easily channel information and to properly attach value. Back in the early days of knowledge management, Swiss scientist Etienne Wenger came up with the notion of "Communities of Practice" (or CoP for short). These are groups with a common interest that have either been created or have evolved naturally, where people share information and experiences and have an opportunity to develop personally and professionally. Our DNN community is an example of such a CoP. Note that CoPs are not about tools. They are about group learning.
But certain technical tools have greatly propelled some of these groups forward. We could imagine the materials guy from the earlier example being part of a CoP that is not part of the office, but lives outside it. The point is: we think in groups when dealing with the daily information deluge. And for a tool to be useful it must be able to grasp that concept (e.g. why does Skype not offer me the ability to group contacts and control my visibility to each?). One thing I'd look for in the example above is for the materials expertise group to be able to hook into a wider CoP. An idea for a new module, maybe?

An exciting opportunity

So now to the meat of my argument. The new social bits of DNN are the seeds of a beautiful and bright future for the framework, as we can now bring this as a knowledge management tool set to our clients. The ability to create communities through groups and the ability to offer microblogging in an enclosed environment are valuable assets for someone who is looking for ways to get people interacting and sharing knowledge. Obviously you need to keep in mind that it's not sufficient to just put it out there. You need a clear strategy to get where you want to go. Help setting up groups. Maybe opening some of it up to the external world. And probably some non-technical interventions (rewarding those that contribute, for instance). But keep this in mind: the sky's the limit, as we have a very extensible framework and you can tweak this any way you want. Facebook has apps. But we have a complete Facebook app in our hands!

More reading

Nonaka, I. and Takeuchi, H. (1995) The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation.
Oxford University Press.

Other publications of Novay on Knowledge Management: https://doc.novay.nl/dsweb/View/Collection-7770
Blogs by friends of mine on Knowledge Management: Croeso and Mathemagenic
An example of the tension between security and knowledge sharing at NASA: Balancing Security and Knowledge Sharing (PDF)

The links in the text point mostly to Wikipedia articles. Those links are also great starting places for exploring the field of Knowledge Management.
Posted almost 4 years ago by Erik van Ballegoij
A long time ago I blogged about the caching enhancements in DotNetNuke 5, and a recent blog post by Mitchel Sellers about using DotNetNuke caching in custom modules triggered me into revisiting my old post. In his excellent post, Mitchel describes a way to abstract away some of the available methods in the DotNetNuke.Common.Utilities.DataCache class. This is indeed a great way to simplify using the DotNetNuke entity cache in your own modules. One thing that is not so great about those base methods is that they are not geared towards use in high traffic applications. The DataCache.GetCache method, for instance, is in itself not thread safe. Remember that in the "old" days of DotNetNuke 4.9, quite a few sites had caching issues because of this. Another reason to use this caching pattern is that it allows you to also set the cache timeout and cache priority very easily. The caching pattern that was introduced in DotNetNuke 5 has since been enhanced and optimized for performance and thread safety. One of the enhancements is that there is now also support for cached dictionaries. In rereading my old blog post, I realized that it was still not very clear how to use that pattern in your own code: it lacked a simple example. So on the rebound, I will point to another, simpler example in the core, and also show how this caching pattern will be used in the new version of the Announcements module. Let's first look at the method the core uses to retrieve all pages for a portal, DotNetNuke.Entities.Tabs.TabController.GetTabsByPortal, together with its corresponding callback method. The pattern is extremely simple to use. In the backend, CBO.GetCachedObject calls DataCache.GetCachedData, which you could also call directly from your own code. However, the way CBO (which stands for Custom Business Objects) is architected, it is DotNetNuke's own object factory. The class has all kinds of handy methods to help create your objects.
So using that same object factory to retrieve objects from cache makes sense. In the upcoming version of the Announcements module, caching will be one of the focus areas, especially since the current version has some funky caching issues. The new version of the module has to keep track of the URL of a single item in the Announcements module, in order to be able to generate a proper ATOM feed (the module will support both RSS and ATOM feeds). One option was to store that URL in the database, but that would cause issues when a module is moved to a different page. I decided to make this a calculated field; however, since that calculation is not very efficient, the result of the calculation needs to be cached. In this case, the PermalinkCallback method is only ever called if the data is not in the cache; in all other cases, the value is loaded in a thread safe manner from the data cache. If your custom code does not use caching, then it would be a very good idea to start experimenting with it. DotNetNuke makes it very easy for you to start using caching, and does a pretty good job of abstracting away the inner workings of a good caching pattern.

This post is cross posted from my personal blog.
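The permalink code itself appears only as a screenshot in the original post. As a hedged sketch of the CBO caching pattern described above (the cache key, timeout, and the CalculatePermalink helper are my assumptions for illustration), the shape is roughly:

```csharp
using System.Web.Caching;
using DotNetNuke.Common.Utilities;

public class AnnouncementPermalinks
{
    // Assumed cache key format; the real module may use a different one.
    private const string PermalinkCacheKey = "AnnouncementPermalink-{0}-{1}";

    public string GetPermalink(int moduleId, int itemId)
    {
        var cacheArgs = new CacheItemArgs(
            string.Format(PermalinkCacheKey, moduleId, itemId),
            20, // cache timeout in minutes (assumed)
            CacheItemPriority.Normal,
            moduleId,
            itemId);
        // Thread safe: on a simultaneous cache miss, only one request runs
        // the callback; the others wait and read the cached result.
        return CBO.GetCachedObject<string>(cacheArgs, PermalinkCallback);
    }

    private object PermalinkCallback(CacheItemArgs cacheItemArgs)
    {
        var moduleId = (int)cacheItemArgs.ParamList[0];
        var itemId = (int)cacheItemArgs.ParamList[1];
        // The expensive URL calculation only runs on a cache miss.
        return CalculatePermalink(moduleId, itemId);
    }

    private string CalculatePermalink(int moduleId, int itemId)
    {
        // Hypothetical stand-in for the real (inefficient) URL calculation.
        return string.Format("/Announcements/{0}/{1}", moduleId, itemId);
    }
}
```

This mirrors the GetTabsByPortal example from the core: a CacheItemArgs describing the cache entry, plus a callback that is only invoked when the data is not already cached.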
Posted almost 4 years ago by Shaun Walker
Most people are familiar with the sayings "Great Minds Think Alike" and "Imitation Is The Sincerest Form Of Flattery". Well, this past week at the SharePoint Conference in Las Vegas, Nevada, Microsoft communicated to the market that the future of SharePoint, its widely deployed (and often criticized) internal collaboration and document management system, would focus on three key industry trends: Cloud, Mobile, and Social. This is a very familiar story for us at DotNetNuke Corporation, as it was 2 years ago when we first predicted these trends would be the three most disruptive and relevant technology innovations for the web software industry. In the years since, DotNetNuke has made significant progress in embracing these industry trends and integrating them into our products and services in a very deep and rich manner. This focus enabled us to deliver compelling solutions to customers which, in turn, allowed them to leverage these same trends within their specific business environments to achieve greater efficiency, productivity, and innovation. Our product vision also allowed us to demonstrate thought leadership in the market, resulting in opportunities to share our message with the world – a great example being the keynote I delivered at the CMS Expo conference in Chicago earlier this year. It appears that Microsoft has now also taken notice of these industry trends, and we are excited at the opportunities this may provide, especially in terms of our ability to leverage our investments in our SharePoint Connector to create even richer integrations with SharePoint to satisfy customers' business goals.

(Keynote address being delivered by Jared Spataro, Senior Director of SharePoint Product Management at Microsoft Corporation, at the SharePoint Conference 2012 in Las Vegas)
Posted almost 4 years ago by Will Strohl
It may surprise you to hear that a region as populated as London has never had a DotNetNuke user group. That's right! There isn't an area in the United Kingdom that is considered to be more metropolitan. It's also considered to be a leading global city in numerous areas such as arts, commerce, education, entertainment, fashion, finance, healthcare, media, research and development, and more. Yet DNN hasn't reached their techie scene. That's changing! The very first user group meeting that focuses on DNN will be held there in just a couple of short weeks.

It is with great pleasure that I can yet again introduce you to a brand new DNN user group. LonDNN is led by John J. Royle of Glanton Corporate Web Services – which is currently a Gold DotNetNuke Partner. Their first meeting is on Wednesday, December 5, 2012 – or 05/12/2012 for you locals. Hehehe…

What Can You Expect?

First, don't forget that this is the kick-off meeting. This usually consists of a lot more networking than anything else, and also helping to form what the user group will be moving forward. However, you can thank awesome sponsors like PowerDNN and Novus Leisure Ltd. for making sure that you will have a great venue, refreshments, and prizes at the very first meeting! And of course thanks goes to Glanton Corporate Web Services, since they are helping to sponsor the coordination and promotion of the monthly events. Did you know they are sponsoring the S.E. Queensland DNN User Group too?

Many user groups have to work for months or years before you have a chance to meet notable community members. This first meeting already has people like James Roswell of 51Degrees (the software behind the awesome DNN mobile redirection features), Harvey Kandola & Saar Cohen of Gemini (the software that runs the DNN bug tracker), and Salar Golastanian.
You may have seen or heard of Salar from any number of community and ecosystem contributions through Salaro. I've looked at the attendee list personally and saw quite a few familiar faces there, but there are too many to mention. Sorry, gents. :)

On a final note for what to expect… I heard through the grapevine that there is some really nice DNN swag on the way to the event. If you want some of that, you have to show up!

Where and When?

The first monthly meeting of LonDNN will be held at the Piccadilly Institute, in the Shrink room. All details and registration can be found below:

Target Audience: All types of DNN users, even if you're new to DNN
When: December 5, 2012 @ 6:30 PM – 8:30 PM
Where: Piccadilly Institute, in the Shrink Room
Address: The London Pavillion, 1 Piccadilly Place, London
Register to Attend

This blog entry is cross-posted from my personal blog site.
Posted almost 4 years ago by Israel Martinez
We are pleased to announce the release of DotNetNuke version 6.2.5. This is a minor release with security fixes and a select number of high priority issues. We encourage all users of earlier versions to upgrade to this latest release. And please continue supporting our quality efforts by logging any issues you may find in Gemini, and let us know if an issue is closed without a satisfactory solution. Below is a summary of the changes for this release. For more information about a specific issue, please refer to the official change log.

Major Highlights

Enhanced the page settings functionality to allow the user to specify link behaviour, like existing window or new window
Fixed issue where invalid subdirectories are created under App_Code when mapping the DesktopModules folder structure for dynamic modules
Updated the friendly error page to also display the actual HTTP error code
Fixed error in the WebRequestCaching provider
Fixed issue where pages that use caching would not transmit a Content-Type value in the HTTP response header
Fixed exception when publishing content using Content Staging
Posted almost 4 years ago by Peter Donker
As most of you are aware, DotNetNuke has rolled out with 5 extra languages over the past year: German, Spanish, French, Italian and Dutch. These translations are done by trusted partners under contract, so DNN Corp can assure their quality and their availability on time when a new version of DNN is released. This may have gone largely unnoticed in places like the US, Canada or Australia, but it has made quite an impact in the mentioned language zones. And things will get even better. The above is just one (visible) step that is being taken to make the platform more accessible to those in the non-English speaking regions. In the future we aim to open up other languages in a much more fluid way than is currently the case. But one step at a time. As part of our effort to get there, a new tool has been developed to help translators speed up the translation process. This tool is called the "DNN Translator" and is available on CodePlex at https://dnntranslator.codeplex.com/ Why this new tool? As you may know, the texts that are used in DotNetNuke's code are stored in so-called resource files. These are XML files with the .resx extension that you find in "App_LocalResources" folders all over the place. Localized (i.e. translated) versions of these files override the original texts when a page is requested in another language. You'll spot the localized versions by their names, for instance AddModule.ascx.nl-NL.resx: the Dutch translation of the texts used in the AddModule control. DotNetNuke includes a module to help users edit these resource files. You can opt to edit the original (i.e. for en-US), a specific language, or a text specific to the portal. Although this module is fine for the occasional translator, we found that dedicated translators consider it a cumbersome tool to work with. And flipping back and forth between the editor and the result is also a suboptimal experience. Thinking outside the box, we came up with the idea of an "offline" tool, i.e.
a resource translation tool that works on an installation locally on your hard drive. Translators tend to work with a copy of the site on their own machine, so there was no strict reason to have the tool online (i.e. as a module). Having it offline opens the door to some interesting possibilities and in general makes the tool more responsive (it doesn't need to go through IIS). This tool became the DNN Translator. Who should get excited about this? The target audience for this tool is translators who maintain language packs. The occasional translation or portal text override is still perfectly possible with the built-in editor. But if you are translating the whole framework, you have hundreds of files to get through, and you're better off with this tool. So translators should use this; (end/admin) users should use the Language Editor built into DNN. What are its features? What follows is a summary of features. This tool is in no way "done". We foresee new features being added as the tool gains traction and we begin to get some feedback. But what follows is where we are at roughly the DNN 7 release time. A modern look and feel We intended to make the experience as pleasant as possible for translators and to make sure we were up to date in terms of UI standard practice. This is probably the hardest part of the application to do well. We are happy with the way it now presents itself to the user, but we remain open to suggestions. On-screen editing as in the Language Editor The basics remain the same. The translator sees the original text and the translation side by side. When you click Save (or press Ctrl-S), the resource file is instantly saved and the changes can be seen in the application if you're viewing it through IIS at that point. The focus of this part of the UI is really productivity: jumping from one field to the next easily, and highlighting empty fields or those where the text is equal to the original.
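That highlighting of empty or untranslated fields boils down to comparing the original resx file with its localized counterpart. A minimal sketch of the idea in Python (the resx snippets, key names and Dutch texts here are illustrative only; the actual tool is a .NET application):

```python
import xml.etree.ElementTree as ET

# Hypothetical fragments of an original resx and its nl-NL counterpart.
ORIGINAL = """<root>
  <data name="Cancel.Text"><value>Annuleren</value></data>
  <data name="Cancel.Text"><value>Cancel</value></data>
  <data name="Save.Text"><value>Save</value></data>
  <data name="Delete.Text"><value>Delete</value></data>
</root>"""

DUTCH = """<root>
  <data name="Cancel.Text"><value>Annuleren</value></data>
  <data name="Save.Text"><value>Save</value></data>
</root>"""

def read_resx(xml_text):
    """Collect the <data name="..."><value>...</value></data> entries into a dict."""
    out = {}
    for data in ET.fromstring(xml_text).findall("data"):
        value = data.find("value")
        out[data.get("name")] = (value.text or "") if value is not None else ""
    return out

def needs_attention(original, localized):
    """Flag keys that are missing, empty, or whose text equals the original."""
    return sorted(k for k, text in original.items()
                  if localized.get(k, "") in ("", text))

original = read_resx(ORIGINAL)
dutch = read_resx(DUTCH)
print(needs_attention(original, dutch))  # ['Delete.Text', 'Save.Text']
```

Here "Save.Text" is flagged because the Dutch text is still identical to the English original, and "Delete.Text" because it has no Dutch entry at all.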
Edit any piece of your installation It is important to realize that the tool is not solely meant for translators translating the core. It is equally valid for users who are responsible for translating (3rd party) modules. The application attempts to read from the site's database which components are installed and brings this up in a dropdown. Selecting a component brings up its resource files in the tree in the main window. Support for Google and Bing translation services We find that most translators use Google or Bing to prime their translation first, and then go in and adjust those translations. The tool supports this scenario. Register your favourite translation service and leverage the translations it provides. You can have the translation service put its translations in a separate column to avoid overriding your existing texts. This allows you to experiment without risk of destroying data. Note: both Google and Bing now require you to create an account with them to use these services. Bing offers a threshold of free translations per month. For more details consult Google and Microsoft. HTML text editor We regularly store HTML in resource files. It is encoded to keep the file valid XML, which is impractical for editors, as they have to sift through the extra blurbs the encoding adds to the field. Instead, we can use an HTML editor to edit the field, which allows the translator to focus on the text. Note that it is naturally not possible to include images, as HTML only caters for links to images and those links will not work. But the editor will allow you to use all the text styling necessary. Support for your own dictionary Some translators keep an internal list of frequently translated terms. "Cancel" and "download" are terms that you find often when translating. The Translator allows you to maintain your own dictionary and can prime fields (just like the translation fields) when it finds texts already in the dictionary.
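The dictionary priming described above can be sketched in a few lines of Python. The keys, the two-entry dictionary and the Dutch translations are all made up for illustration; the point is the whole-phrase lookup against empty fields:

```python
# Hypothetical personal dictionary of frequently recurring whole phrases.
DICTIONARY = {
    "Cancel": "Annuleren",
    "Download": "Downloaden",
}

def prime_from_dictionary(original, localized, dictionary):
    """Fill empty translation fields whose original text exactly matches a dictionary entry."""
    primed = dict(localized)
    for key, text in original.items():
        if not primed.get(key) and text in dictionary:
            primed[key] = dictionary[text]
    return primed

original = {
    "cmdCancel.Text": "Cancel",
    "cmdDownload.Text": "Download",
    "lblHelp.Text": "Need help?",
}
translation = {"lblHelp.Text": "Hulp nodig?"}  # the rest is still untranslated

primed = prime_from_dictionary(original, translation, DICTIONARY)
print(primed["cmdCancel.Text"])    # Annuleren
print(primed["cmdDownload.Text"])  # Downloaden
```

Existing translations ("Hulp nodig?") are left untouched; only empty fields with an exact dictionary match get primed.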
Note this works on whole phrases only. So "Do you wish to delete this item?" would go into the dictionary in its entirety. Save and compare snapshots A snapshot is a freeze of all original texts. Use it to compare changes between versions of whatever you are translating. Let's say you are translating DNN 6.2.3: save the snapshot as DNN623. Then, when 6.2.4 comes out, you upgrade the installation you're working on and run a compare against the DNN623 file. It will tell you what changes there have been (i.e. changed texts, deprecated texts and new texts) and allow you to quickly bring up these changes for editing. Connect to the Localization Editor module This is the most exciting part for translation managers (like myself). The Translator can hook into the Localization Editor module remotely. The Localization Editor is a module meant to help maintain language packs. It can monitor different modules, different versions thereof, and different locales. It is also available on CodePlex. The idea is that in the future DNN Corp will open up a central repository to which translators can submit any pack with this tool (the DNN Translator leverages the new services framework in DNN 7 for this). So if you have a Hungarian translation for version 6.0.3 of the Events module, you could just upload it instantly and the translation would be available to others. Note: as this will make use of the new and upcoming Web API in DNN 7, this part is still a bit in flux while we wait for the definitive release of DNN 7. But this is where the tool is heading and how we envision its use in the DNN community in the future. Create language packs Finally, and most importantly, the Translator can generate language packs. In the program's settings you can specify the various fields to include in the pack (name/license/etc). The packs conform to the DNN standard for packaging resource files.
You can even generate "Full" packs, meaning you package the translated resource files for both the framework and any installed module in one go. Getting started So how can you get started? Head over to CodePlex, download the tool and install it. There is a manual among the downloads which will guide you through the first steps.
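The snapshot comparison described earlier amounts to diffing two key-to-text maps of the original texts. A rough Python sketch, with made-up keys and version data (not the tool's actual file format):

```python
def compare_snapshots(old, new):
    """Diff two {key: original text} snapshots into new, changed and deprecated keys."""
    return {
        "new": sorted(set(new) - set(old)),
        "deprecated": sorted(set(old) - set(new)),
        "changed": sorted(k for k in set(old) & set(new) if old[k] != new[k]),
    }

# Hypothetical snapshots of the original (en-US) texts for two framework versions.
dnn623 = {"Save.Text": "Save", "Close.Text": "Close", "Old.Text": "Legacy"}
dnn624 = {"Save.Text": "Save changes", "Close.Text": "Close", "New.Text": "Brand new"}

result = compare_snapshots(dnn623, dnn624)
print(result)
# {'new': ['New.Text'], 'deprecated': ['Old.Text'], 'changed': ['Save.Text']}
```

The three buckets correspond directly to the "new texts, deprecated texts and changed texts" the tool brings up for editing after an upgrade.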