Tags : Browse Projects


uniquemodel


  Analyzed about 1 year ago

Overview

UniqueModel provides a mechanism for application developers to define a list of property tuples to be kept unique in the Google App Engine Datastore. See it in action (code).

Noteworthy Features

- SQL-like robust uniqueness enforcement
- Able to enforce uniqueness of multiple property tuples for a single kind
- Uniqueness is enforced even on concurrent put() operations
- Virtually no programming overhead for normal usage (see example below)

Support and Feedback

If you decide to give UniqueModel a try, I'd love to hear from you. Please send me an e-mail (my address is located to the right) and submit any issues you find to the issue tracker. If you need support, I can be reached by e-mail, or I can also be found lingering in #appengine on irc.freenode.net under the nick IAB.

Simple Example

As an example, let's say I'm writing a simple game in which players interact on a rectangular 2D map. To represent a player's location on the map, I create a structure called locations in which I store the x and y values of the player's position and the ID of an active player. I'm scared of ghosts and things that can be in multiple places at the same time. I know, it's an irrational fear, but it's my game dammit! To keep more than one player from occupying the same position on the map, and therefore keep the ghosts at bay, I need to make sure that a point is represented exactly 0 or 1 times in my locations structure. Also, to be sure nobody occupies more than one position, I need to be sure that an active player_id is represented exactly 0 (an active player can be dead) or 1 times.
In scientific terms, the property tuple (x, y) and the single property player_id must each be unique within the set of all locations. In a typical SQL schema definition script this would be accomplished like so:

CREATE TABLE locations(
    x integer,
    y integer,
    player_id integer references active_players(player_id),
    unique(x, y),
    unique(player_id)
);

There is no similar way of enforcing uniqueness in the App Engine Datastore. Fortunately, to alleviate this problem, we can use UniqueModel:

from google.appengine.ext import db
import burns

class Locations(burns.UniqueModel):
    x = db.IntegerProperty()
    y = db.IntegerProperty()
    player = db.ReferenceProperty(reference_class=ActivePlayer)
    _uniques = set([
        (x, y),
        (player,),  # note that single properties must be stored as a tuple
    ])

Just like with the SQL schema definition, if an insert or update violates the uniqueness constraint placed on (x, y) or player, an exception (UniqueConstraintViolatedError) is raised. No ghost-like or omnipresent players to deal with!

Under the Hood

In a nutshell, UniqueModel works by creating an MD5 hash of each property tuple to be kept unique and storing it in the datastore. When put() is called on an instance of UniqueModel, the following happens:

1. UniqueModel calculates a hash of each property tuple to be kept unique.
2. In a transaction, UniqueModel checks for the existence of these hashes in the datastore: if any of the hashes already exist, the transaction raises an exception; otherwise, each of the hashes is stored with a reference back to the UniqueModel that it represents.
3. If the transaction raised an exception, UniqueModel calls delete() on itself to undo the previous put() and then raises the exception again.
4. If no exception was raised, db.Model.put() is called and UniqueModel.put() returns normally.

Limitations

There are a few limitations imposed by UniqueModel that the application developer should be aware of.
Stores Many Additional Entities

The uniqueness of each kind is modeled as entities of kind UniquePropertyInstance. Given n entities of a specific kind that extends UniqueModel, n*m entities of kind UniquePropertyInstance must exist to model the uniqueness of that kind, where m is the number of property tuples to be kept unique for the kind in question.

Uniqueness Is Modeled as an Entity Group

To verify uniqueness in a way that is safe from concurrent put() operations, the uniqueness must be tested and set in an atomic operation. The only way to do this with the App Engine datastore is to use a transaction. However, transactions have the unfortunate restriction that they can only operate on a single entity group. That means that in order to check uniqueness for an entire kind, the entities that model the uniqueness of that kind must have a common parent. Because of all this, UniqueModel stores an entity of kind UniquePropertyInstanceParent for each kind that extends UniqueModel; this entity serves as the root of the entity group for that kind. Therefore, for each kind extending UniqueModel, an entity group of size n*m+1 must exist to ensure uniqueness on concurrent put() operations, where n is the number of entities of the kind and m is the number of property tuples to be kept unique for the kind.

Uniqueness Is Verified in a Transaction

As stated above, it is necessary to check and submit the uniqueness of an entity within a transaction. Unfortunately, since transactions on the datastore do not nest, UniqueModel.put() cannot be called from within a transaction.

Uniqueness Is Checked with MD5 Hashes

To create the hash of a property tuple, each tuple member's __str__ method is called and the results are concatenated. The resulting str is then passed to hashlib to be turned into an MD5 sum.
This imposes the following problems:

- Extra CPU usage on put() (see http://uniqueificate.appspot.com (code) for an example)
- Only works for properties whose __str__ methods accurately represent the property value
- Possibility (although very low likelihood) of forced collisions
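The hashing scheme described above can be sketched as follows. This is a minimal illustration — the class and function names are hypothetical stand-ins, not UniqueModel's actual internals:

```python
import hashlib

class Location(object):
    """Stand-in for a db.Model instance with x and y properties."""
    def __init__(self, x, y):
        self.x, self.y = x, y

def unique_hash(entity, prop_names):
    """MD5 of the concatenated str() of each property in the tuple.

    Because values are simply concatenated before hashing, distinct
    tuples such as ('ab', 'c') and ('a', 'bc') hash identically --
    the "forced collisions" limitation noted above.
    """
    joined = "".join(str(getattr(entity, name)) for name in prop_names)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

# Two entities at the same (x, y) yield the same hash, which is how
# a transaction can detect the duplicate before committing the put().
assert unique_hash(Location(3, 4), ("x", "y")) == \
       unique_hash(Location(3, 4), ("x", "y"))
```

Storing such a hash per unique tuple, and checking for its existence inside a transaction, gives the check-and-claim behavior the steps above describe.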

97 lines of code

0 current contributors

over 8 years since last commit

0 users on Open Hub


yaw-appengine


  No analysis available

This widget makes life easier by taking in models and some parameters. The output it produces is a form that can perform all basic functions possible on that model. And what's more... it's fully AJAX based. No worries. See this project live at http://pran.appspot.com. The code is being updated at a very high rate, so there is no code release yet; I will release it once I have stable code ready.

0 lines of code

0 current contributors

0 since last commit

0 users on Open Hub

Mostly written in language not available
Licenses: GPL-3.0+

sluggable-mixin


  Analyzed about 1 year ago

This project allows a Google AppEngine developer to quickly and easily add URL-ready slugs to entities in the DataStore.
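The project page doesn't show its API, but slug generation along these lines is the usual approach — a generic sketch, not sluggable-mixin's actual code:

```python
import re
import unicodedata

def slugify(title):
    """Turn an arbitrary title into a URL-ready slug."""
    # Fold accented characters to their closest ASCII equivalents.
    text = unicodedata.normalize("NFKD", title)
    text = text.encode("ascii", "ignore").decode("ascii")
    # Lowercase, collapse runs of non-alphanumerics into single hyphens.
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

# e.g. slugify("Hello, World!") -> "hello-world"
```

A mixin like this one would typically compute the slug automatically from a title property when the entity is put() to the Datastore.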

8.7K lines of code

0 current contributors

over 8 years since last commit

0 users on Open Hub


datastore-sqlite-sync


  Analyzed over 2 years ago

This self-contained tool enables incremental synchronization of your data from Google AppEngine's Datastore to a local SQLite 3 database. It will automatically convert your model definitions into an SQL schema, and download the model data in parallel using the new remote_api. Synchronization is currently unidirectional (i.e. Datastore to SQLite, not the other way around) due to a lack of need, and also at least one technical limitation.

Introduction

There are still many data processing tasks that aren't possible on AppEngine but to which a traditional SQL database is ideally suited. These include driving a full-text indexer, running reports that aggregate complex statistics in multiple dimensions from your data, or asking ad-hoc questions that your application's counters weren't designed to support. This utility solves these problems while enabling many more applications to integrate with your data with minimal time and effort. An SQL schema is automatically generated by introspecting the model classes defined in your application's source code, into which new and updated entities are automatically fetched with a single command. The database may be used for running interactive queries using one of the many free GUI management tools, or even hooked up to Microsoft Excel via ODBC. Additionally, by configuring a cron job it is possible to mirror your entire Datastore to an external location, for consumption by external software (for example, full-text indexing, automated reporting and analysis, etc.).

How This Works

The magic __key__ property is used to search for new immutable entities, while any Model subclasses that define at least one DateTimeProperty with auto_now=True will automatically be re-fetched every time they are modified. This means that you gain completely hands-free replication to an effortlessly accessible SQL database simply by including a single DateTimeProperty in each of your mutable models.
Model → SQL Schema Mapping

All identifiers are mapped like so: ModelClassName → model_class_name, and ModelClassName.propertyName → model_class_name_property_name. The current code generates the following SQL objects:

- One TABLE for each Model subclass, containing a unique integer (id) and the string-encoded Datastore Key.
- One TABLE for each property of each Model, containing an integer (class_name_id) and the actual value column (value).
- One VIEW for each Model, defined as a SELECT which joins all the other tables together into a single combined "SQL-like" table. This is the most useful object.

This design was chosen to make it trivial to add and drop columns on large data sets, without having to wait for large ALTER TABLE commands to run. It also makes it easier to define custom VIEWs on just the subset of your data you need.

Type Mapping

SQL Type    Datastore Native Type
TEXT        Key, ByteString, IM, User, GeoPt, PhoneNumber, Email, etc. (all simple text fields)
TIMESTAMP   datetime, date, time
INTEGER     int, long, Rating
REAL        float

Example

The following model:

class Address(db.Model):
    careOf = db.StringProperty()
    line1 = db.StringProperty()
    line2 = db.StringProperty()
    lastUpdated = db.DateTimeProperty(auto_now=True)

Would be mapped to:

CREATE TABLE address_entity(id INTEGER PRIMARY KEY, ds_key TEXT NOT NULL);
CREATE TABLE address_care_of(address_entity_id INT NOT NULL, value TEXT);
CREATE TABLE address_line1(address_entity_id INT NOT NULL, value TEXT);
CREATE TABLE address_line2(address_entity_id INT NOT NULL, value TEXT);
CREATE TABLE address_last_updated(address_entity_id INT NOT NULL, value TIMESTAMP);
CREATE UNIQUE INDEX address_entity_ds_key_index ON address_entity(ds_key);
CREATE UNIQUE INDEX address_care_of_address_entity_id_index ON address_care_of(address_entity_id);
CREATE UNIQUE INDEX address_line1_address_entity_id_index ON address_line1(address_entity_id);
CREATE UNIQUE INDEX address_line2_address_entity_id_index ON address_line2(address_entity_id);
CREATE UNIQUE INDEX address_last_updated_address_entity_id_index ON address_last_updated(address_entity_id);

CREATE VIEW address AS
SELECT address_entity.id AS __id,
       address_entity.ds_key AS __ds_key,
       address_care_of.value AS care_of,
       address_line1.value AS line1,
       address_line2.value AS line2,
       address_last_updated.value AS last_updated
FROM address_entity
LEFT JOIN address_care_of ON (address_care_of.address_entity_id = address_entity.id)
LEFT JOIN address_line1 ON (address_line1.address_entity_id = address_entity.id)
LEFT JOIN address_line2 ON (address_line2.address_entity_id = address_entity.id)
LEFT JOIN address_last_updated ON (address_last_updated.address_entity_id = address_entity.id);

-- Example query:
SELECT COUNT(*) FROM address WHERE line1 LIKE '1 Infinite Loop%';

Usage

Usage: dsync.py [options]

Options:
  -a            AppEngine application name, e.g. "shell"
  -e            Developer's e-mail address (default: prompt)
  -p            Developer's password (default: prompt)
  -r            remote_api path on server (default "/remote_api")
  -L            Prepend extra path to module search path
  -m            Load Model classes from module
  -d            Local database path (default "./models.sqlite3")
  -x            Exclude the given Model class
  -N            Number of fetch worker threads (default: 5)
  -C            Number of entities to fetch per request (default: 50)
  -v            Verbose/debug output
  --batch       Fail rather than prompt for a password if none is provided on the command line.
  --sdk-path=   Path to AppEngine SDK (default: search).
  --trigger-cmd="ModelGlob:command"
                Arrange for the given system command to be executed after new or updated
                entities whose class matches ModelGlob are fetched (glob may contain * and ?).
  --trigger-sql="ModelGlob:SQL STATEMENT"
                Arrange for the given SQL statement to be executed after new or updated
                entities whose class matches ModelGlob are fetched (glob may contain * and ?).
  --help        This message.

Commands:
  sync-model    Synchronize the database schema to match the loaded Model subclasses.
  fetch         Start fetching data from Datastore to the database, synchronizing its schema if necessary.
  orphaned      List properties and tables that no longer have associated definitions in the loaded Model classes.
  prune         Remove properties and tables from the local database that no longer have associated definitions in the loaded Model classes. Check "orphaned" output first to verify this won't result in data loss!

Example: back up "myapp.appspot.com"'s Datastore to $HOME/myapp.db, except for RemoteUrlCacheEntry:

dsync.py -L $HOME/src -m myapp.models -m myapp.counters \
    -d $HOME/myapp.db -x RemoteUrlCacheEntry \
    -a myapp -e me@gmail.com -p 1234 \
    fetch

Known Issues

This code represents about 10 hours' worth of work; it's still at a very early stage. Please see the issue tracker for a more definitive list of issues, but most importantly right now:

- With high worker thread counts, SQLite locks time out.
- Not well tested.
- Only supports SQLite; PostgreSQL support would be great for very large data sets.

I will be productionizing the code in the coming months, but if you come across a major issue, please tell me about it.

Configuration

Local System

Before using the utility, you must have Python 2.5 and the Google AppEngine SDK installed on your working machine, along with a copy of your site's source code. The source code is currently necessary for introspecting your application's Datastore schema. The utility searches for the SDK in the default locations for Windows and OS X; if you have changed the install path for the SDK, you will need to specify it using a command-line parameter. After completing these steps, you must configure a remote_api URI for your site as shown below.

Application Changes

Simply add the following to app.yaml in your project directory:

handlers:
- url: /remote_api
  script: $PYTHON_LIB/google/appengine/ext/remote_api/handler.py
  login: admin

It is recommended that you use the URI shown above; however, you can specify an alternative using a command-line parameter if you desire. This enables the remote_api handler built into AppEngine, which you can read more about here.
Creating a Sync Script

It is recommended that you create a batch file or UNIX shell script to save the settings used for launching the utility. This helps reduce the potential for human error (for example, missing a library path). The main tasks required are picking a stable location for your site's source code, say $HOME/src/my-site (or %USERPROFILE%\my-site for Windows users), and figuring out which modules in your source code contain Datastore model definitions. This will be easier if you use only a single file for declaring models. Here is an example sync script written in bash. It will prompt for me@gmail.com's password at startup, or you can hard-code the password using the -p option.

#!/bin/bash
EMAIL=me@gmail.com
APPNAME=mysite                              # mysite.appspot.com
SOURCE_DIR=$HOME/src/my-site
DATABASE_PATH=$HOME/data/my-site.sqlite3
TRIG="Member:$HOME/bin/index-new-members.sh"

# -m my_site.models is equivalent to $SOURCE_DIR/my_site/models.py.
# -N 5 starts 5 worker threads; --trigger-cmd triggers indexing when
# a new member joins.
dsync.py \
    -L $SOURCE_DIR \
    -m my_site.models \
    -m my_site.counter \
    -a $APPNAME \
    -e $EMAIL \
    -d $DATABASE_PATH \
    -N 5 \
    --trigger-cmd="$TRIG" \
    fetch

Start Syncing

$ ./my-sync-script.sh
INFO:root:Server: my-site.appspot.com
INFO:root:Session done: 1 added, 0 updated.
INFO:root:Room done: 1 added, 0 updated.
INFO:root:Invitation done: 2 added, 0 updated.
INFO:root:Video done: 5 added, 0 updated.
INFO:root:Item done: 5 added, 0 updated.
INFO:root:Member done: 2 added, 0 updated.
INFO:root:grand total: 1.73s for 16 added, 0 updated in 13 models.
$ ./my-sync-script.sh
INFO:root:Server: my-site.appspot.com
INFO:root:grand total: 0.97s for 0 added, 0 updated in 13 models.

If all has gone well, you should now have a local SQLite copy of your Datastore!
Run Some Queries

$ sqlite3 $HOME/data/my-site.sqlite3
SQLite version 3.5.7
Enter ".help" for instructions
sqlite> .mode column
sqlite> SELECT COUNT(*) FROM session;
COUNT(*)
----------
1
sqlite> SELECT session_id, display_name, last_seen, last_address, expiry_time, first_seen FROM session;
session_id                        display_name  last_seen            last_address    expiry_time  first_seen
--------------------------------  ------------  -------------------  --------------  -----------  -------------------
0ed373e4a697d2fed5b7e6b00f462423  Anonymous     2009-02-24 15:02:45  80.127.152.220               2009-02-24 12:16:03

Feed A Starving Hacker!
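The identifier mapping described under "Model → SQL Schema Mapping" above can be sketched like this. These helper names are hypothetical, chosen for illustration rather than taken from the tool's source:

```python
import re

def to_sql_identifier(name):
    """ModelClassName -> model_class_name; propertyName -> property_name."""
    # Insert "_" before every interior capital, then lowercase.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

def entity_table(model):
    """Name of the table holding the id and string-encoded Datastore key."""
    return to_sql_identifier(model) + "_entity"

def property_table(model, prop):
    """Name of the per-property table: one row per (entity id, value) pair."""
    return to_sql_identifier(model) + "_" + to_sql_identifier(prop)

# e.g. entity_table("Address") -> "address_entity"
#      property_table("Address", "careOf") -> "address_care_of"
```

Applying these helpers to the Address model reproduces exactly the table names in the generated schema shown in the example.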

677 lines of code

0 current contributors

over 8 years since last commit

0 users on Open Hub


skatemaps


  Analyzed about 1 year ago

A simple web application built on the Google App Engine using the Google Maps API and Friend Connect. Goals:

- Search for parks in your area using the Google Maps API.
- Use Google APIs (Friend Connect, OpenSocial) to create social glue for the website.
- Park details should allow users to comment, review, upload pics/videos, create events, suggest information, etc.
- Users should be able to add a new park (requires admin moderation).

0 lines of code

0 current contributors

almost 8 years since last commit

0 users on Open Hub

Mostly written in language not available
Licenses: LGPL

django-gae-cache


  Analyzed about 1 year ago

Welcome to django-gae-cache

This project contains a Django middleware and a Google App Engine application.

Motivation

I have a few small web applications on a poor VPS (Virtual Private Server). Because everything runs on a single server (Django, MySQL server, static content), I thought about how to lower server load, save memory, and so on — this project is the result. Using this application, you will save bandwidth, memory, and CPU load on your web server, because all static resources will be served by Google App Engine.

Installation Instructions

These instructions were written for version 1.0.0, but following versions are backward compatible.

Download and install

Install the package using distutils:

svn checkout http://django-gae-cache.googlecode.com/svn/trunk/ django-gae-cache
cd django-gae-cache
sudo python setup.py install

Update your settings.py

... and add the middleware and installed-apps settings:

# Something long and unpredictable
GAE_CACHE_SECRET_KEY = 'qwertyuiop'
# URL of your Google app
GAE_CACHE_URL = 'http://yourapp.appspot.com'
# Turn the GAE cache on/off
GAE_CACHE_USE = True

MIDDLEWARE_CLASSES = (
    ...
    'django_gae_cache.middleware.DjangoGaeCacheMiddleware',
    ...
)

INSTALLED_APPS = (
    ...
    'django_gae_cache',
    ...
)

Sign up for Google App Engine

... and download the Google SDK to your computer: http://code.google.com/appengine/

Configure the GAE application

Go to PACKAGE_DIR/gae-cache, copy _app.yaml to app.yaml and _config.py to config.py. In app.yaml, replace YourAppName with the name of your Google App Engine application. In config.py, fill SOURCE_URL with the name of your site. Also set SECRET_KEY to the same string as settings.GAE_CACHE_SECRET_KEY in your Django application, and set MEDIA_URL to the same string as settings.MEDIA_URL in your Django application.

Deploy the GAE application

In PACKAGE_DIR/gae-cache, run "appcfg.py update ." (appcfg.py is part of the GAE SDK, which you downloaded in the third step). Now visit http://yourapp.appspot.com.
You should be redirected to your site or see 403 Forbidden (depending on the gae-cache configuration); either is OK. Find some existing resource file on your site (http://yourdjangosite.com/media_url/some_file.jpg) and call it through appspot: http://yourapp.appspot.com/media_url/some_file.jpg. Working? Great!

Usage

The caching mechanism works out of the box. Immediately after setting up the middleware and the GAE application, your HTML pages are searched for hyperlinks to static resources, which are replaced by URLs to the same resources via the GAE application. On the first call for a resource, the GAE application downloads the static resource from your server and saves it into memcache. On subsequent requests, the resource is loaded from memcache, without any request to your web server.

Invalidate the cache

Sometimes you want to change or delete a resource. You can invalidate one specific file or the whole GAE cache:

from django_gae_cache import api
a = api.Api()
a.Invalidate('/relative/path/to/resource.jpg')
# or
a.InvalidateAll()

That's all! You can use this, for example, in the save()/delete() methods of your Resource model.

Memcache principle

On the GAE application, resources are stored in memcache (there is a 10 GB free quota on GAE, which is enough for a middle-sized site). Memcache will free the oldest unused resources when the application needs space for newer ones. This means we don't need any advanced cache management, because memcache holds the most useful items at all times.

TODO

My web application is far from the GAE quota limits, so the GAE cache doesn't have any fallback mechanism yet. This means that resources are always redirected to GAE, and Django doesn't care whether the GAE application is working (e.g. GAE is down or over the free quota). If you are near the quota limits, it is a good idea to enable billing. In the near future, I will add some intelligence to the Django middleware so it can decide whether a resource should be redirected to GAE or not.

Conclusion

Let me know if you are using this GAE cache in your project!
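The serve-from-memcache flow described under "Usage" can be sketched roughly as follows. The function names and the dict-as-cache are hypothetical illustrations (the real GAE application uses App Engine's memcache and urlfetch services):

```python
import hashlib

def fetch_cached(path, cache, download):
    """Serve a static resource from cache; on a miss, download it
    from the origin server once and store it for later requests."""
    key = hashlib.md5(path.encode("utf-8")).hexdigest()
    if key not in cache:
        cache[key] = download(path)  # only hits the origin on a miss
    return cache[key]

def invalidate(path, cache):
    """Drop one resource from the cache (cf. api.Invalidate above)."""
    cache.pop(hashlib.md5(path.encode("utf-8")).hexdigest(), None)

# Usage: the first call downloads, later calls hit the cache.
calls = []
def fake_download(p):
    calls.append(p)
    return b"bytes of " + p.encode("utf-8")

cache = {}
fetch_cached("/media_url/logo.png", cache, fake_download)
fetch_cached("/media_url/logo.png", cache, fake_download)
assert calls == ["/media_url/logo.png"]  # origin was hit only once
```

Invalidating a path simply removes its entry, so the next request falls back to the origin server — matching the Invalidate()/InvalidateAll() behavior described above.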

331 lines of code

0 current contributors

over 7 years since last commit

0 users on Open Hub


gdatastore


  Analyzed about 1 year ago

GDatastore lets you easily use a Google Spreadsheet (as a database) in your Google App Engine projects. GDatastore can bind with existing Datastore code or be invoked manually. It extends the functionality of Jeffrey Scudder's wonderful text_db.py (part of the GData Python Client Library) to those of us using Google App Engine.

0 lines of code

0 current contributors

0 since last commit

0 users on Open Hub

Mostly written in language not available
Licenses: LGPL

gae-datastore-backup-utility


  Analyzed about 1 year ago

The Google App Engine Datastore Backup Utility

An easy backup GUI for the Google App Engine Datastore. Please submit bugs! (2/17/2009) Please note that this was built against the 1.2.5 API from October 2009. I am not currently working on this application but may continue in the future when school lightens up. Feel free to submit code fixes to keep it up to date.

Technologies used in the application

- WPF
- C#
- Python
- XML
- Google App Engine SDK

Note: currently this application is only for Windows platforms. If there is enough interest, I can port it to other operating systems; if you would like to contribute to this, let me know.

Need to knows

See the requirements wiki for further information.

- Windows only (currently)
- Python-based GAE apps only
- Requires some steps for installation
- Future versions of the GAE SDK may break version 1.0, since bulkloader.py from the SDK is in testing.

Currently the commands used by this application are in the experimental stage, as posted here: http://code.google.com/appengine/docs/python/tools/uploadingdata.html#Downloading_and_Uploading_All_Data. When the SDK updates, I will try my best to release an update for this application, since it currently relies on these experimental features. If you're looking for me, you can find me over at http://www.jsndev.net

941 lines of code

0 current contributors

over 7 years since last commit

0 users on Open Hub


waverscount


  No analysis available

How many users are we on wave.google.com?

I think we won't get that answer from Google, so here is an attempt at an unofficial count of users. I implemented a tiny robot (waverscount@appspot.com) that counts users in waves. Let's see how many we can get!

Your help is needed for this. You can help by spreading this robot across waves. If you are not the owner, please ask before adding it.

How is it implemented?

The robot computes hash values (SHA-1) of all participating wave addresses and puts them in a datastore. This data is never retrieved by the robot, and the addresses can't be reconstructed from the hash values; the hashes are only needed to keep duplicates out of the datastore. The robot doesn't add or manipulate content in waves it has been added to.

Where can I see the current count?

Visit the robot's appspot address: http://waverscount.appspot.com. The counts are displayed quite simply for now and are updated every 30 minutes. Inside Wave, you can request the latest stats by writing waverscount.getCounts() in a blip; Waverscount will replace the text inside the requesting blip with its latest stats.
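The counting scheme described above — hash each address and rely on the set of stored hashes to weed out duplicates — can be sketched like this. The function name and the in-memory set are illustrative stand-ins for the robot's datastore:

```python
import hashlib

seen = set()  # stands in for the robot's datastore of hashes

def count_participant(wave_address):
    """Record a participant by SHA-1 hash; return True if newly counted.

    Only the hash is stored, so addresses can't be read back; the hash
    exists purely to keep duplicates out of the count.
    """
    digest = hashlib.sha1(wave_address.encode("utf-8")).hexdigest()
    if digest in seen:
        return False
    seen.add(digest)
    return True

count_participant("alice@googlewave.com")
count_participant("bob@googlewave.com")
count_participant("alice@googlewave.com")  # duplicate: not re-counted
assert len(seen) == 2
```

The user count is then just the size of the stored hash set, which is why the robot never needs to read the hashes back individually.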

0 lines of code

0 current contributors

0 since last commit

0 users on Open Hub

Mostly written in language not available
Licenses: Apache-2.0