Merge branch 'master' into shareext
* master: (801 commits)
  Fixing styling of tags and font sizing in story header on web.
  Use ReadingAction for mute/unmute.
  Fixing drag on saved stories and social stories. Also fixing background color on profile activities.
  Fixing username sizing on blurblog. Thanks to J. Leeuwen for finding this.
  Fixing bug when deleting feeds.
  Adding preference for left-swiping feed titles to either show notifications (default) or trainer (old default).
  Alphabetizing notifications.
  Saving notifications on iOS. Just need to update local feed dict on save.
  Mutli select segmented control. Now just need to save and we're done.
  Selecting correct notification filter. Notification type will need a custom segmented control to handle multiple selections.
  Finished styling notifications on ios. Needs web hookup.
  Feeds are now loaded in notifications table. Now to build the cells and save notification types.
  Stubbing in notifications table. Now needs notification feed cells.
  Stubbing in notifications editor on iOS.
  Hiding notifications from non-staff on web in feed popover.
  Android v5.0.0b2
  Fixing date/author/tags size on web.
  Only include active feeds when marking a folder as read.
  clarify ambiguous preview option
  fix story row bordering (#963)
  ...
.gitignore (vendored): 9 changes

@@ -35,6 +35,7 @@ config/secrets
templates/maintenance_on.html
vendor/mms-agent/settings.py
apps/social/spam.py
venv

# ----------------------
# Android

@@ -52,15 +53,19 @@ apps/social/spam.py
# generated files
clients/android/NewsBlur/bin/
clients/android/NewsBlur/gen/
clients/android/NewsBlur/app/
clients/android/NewsBlurTest/bin/
clients/android/NewsBlurTest/gen/

# Local configuration file (sdk path, etc)
clients/android/NewsBlur/local.properties
clients/android/NewsBlurTest/local.properties
originals
/originals
media/safari/NewsBlur.safariextz

# IDE files
clients/android/NewsBlur/.idea
.tm_properties
clients/android/NewsBlur/.gradle
clients/android/NewsBlur/build.gradle
clients/android/NewsBlur/gradle*
clients/android/NewsBlur/settings.gradle
.tm_properties (new file): 2 lines

@@ -0,0 +1,2 @@
exclude = '{$exclude,*.tgz,*.gz,static/*.js,static/*.css}'
excludeDirectories = "{$excludeDirectories,logs,data,clients/android,media/fonts,node_modules,venv,fonts,clients}"
README.md: 35 changes

@@ -275,20 +275,29 @@ reader, and feed importer. To run the test suite:

You got the downtime message either through email or SMS. This is the order of operations for determining what's wrong.

0. Ensure you have `secrets-newsblur/configs/hosts` installed in your `/etc/hosts` so server hostnames
0a. If downtime goes over 5 minutes, go to Twitter and say you're handling it. Be transparent about what it is,
NewsBlur's followers are largely technical. Also the 502 page points users to Twitter for status updates.

0b. Ensure you have `secrets-newsblur/configs/hosts` installed in your `/etc/hosts` so server hostnames
work.

1. Check www.newsblur.com to confirm it's down.

If you don't get a 502 page, then NewsBlur isn't even reachable and you just need to contact the
hosting provider and yell at them.
If you don't get a 502 page, then NewsBlur isn't even reachable and you just need to contact [the
hosting provider](http://cloud.digitalocean.com/support) and yell at them.

2. Check [Sentry](https://app.getsentry.com/newsblur/app/) and see if the answer is at the top of the
list.

This will show if a database (redis, mongo, postgres) can't be found.

3. Check the various databases:
3. Check which servers can't be reached on HAProxy stats page. Basic auth can be found in secrets/configs/haproxy.conf.

Typically it'll be mongo, but any of the redis or postgres servers can be unreachable due to
acts of god. Otherwise, a frequent cause is lack of disk space. There are monitors on every DB
server watching for disk space, emailing me when they're running low, but it still happens.

4. Check the various databases:

a. If Redis server (db_redis, db_redis_story, db_redis_pubsub) can't connect, redis is probably down.

@@ -313,6 +322,14 @@ You got the downtime message either through email or SMS. This is the order of o
it's ephemeral and used for, you guessed it, analytics). You can easily provision a new mongodb
server and point to that machine.

If mongo is out of space, which happens, the servers need to be re-synced every 2-3 months to
compress the data bloat. Simply `rm -fr /var/lib/mongodb/*` and re-start Mongo. It will re-sync.

If both secondaries are down, then the primary Mongo will go down. You'll need a secondary mongo
in the sync state at the very least before the primary will accept reads. It shouldn't take long to
get into that state, but you'll need a mongodb machine setup. You can immediately reuse the
non-working secondary if disk space is the only issue.

c. If postgresql (db_pgsql) can't connect, postgres is probably down.

This is the rarest of the rare and has in fact never happened. Machine failure. If you can salvage

@@ -329,7 +346,6 @@ You got the downtime message either through email or SMS. This is the order of o

```
fab all setup_hosts
fab ec2task setup_hosts
```

d. Changes should be instant, but you can also bounce every machine with:

@@ -337,10 +353,9 @@ You got the downtime message either through email or SMS. This is the order of o

```
fab web deploy:fast=True # fast=True just kill -9's processes.
fab task celery
fab ec2task celery
```

e. Monitor tlnb.py and tlnbt.py for lots of reading and feed fetching.
e. Monitor `utils/tlnb.py` and `utils/tlnbt.py` for lots of reading and feed fetching.

5. If feeds aren't fetching, check that the `tasked_feeds` queue is empty. You can drain it by running:

@@ -348,7 +363,11 @@ You got the downtime message either through email or SMS. This is the order of o
Feed.drain_task_feeds()
```

This happens when a deploy on the task servers hits faults and the task servers lose their connection without giving the tasked feeds back to the queue. Feeds that fall through this crack are automatically fixed after 24 hours, but if many feeds fall through due to a bad deploy, you'll want to accelerate that check by just draining the tasked feeds pool, adding those feeds back into the queue.
This happens when a deploy on the task servers hits faults and the task servers lose their
connection without giving the tasked feeds back to the queue. Feeds that fall through this
crack are automatically fixed after 24 hours, but if many feeds fall through due to a bad
deploy or electrical failure, you'll want to accelerate that check by just draining the
tasked feeds pool, adding those feeds back into the queue. This command is idempotent.
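A minimal sketch of that drain step, assuming `Feed` lives in `apps.rss_feeds.models` as it does elsewhere in this diff; run it from `python manage.py shell`:

```
# Hypothetical shell session; safe to re-run since the drain is idempotent.
from apps.rss_feeds.models import Feed

# Returns every feed stuck in the tasked_feeds pool to the fetch queue.
Feed.drain_task_feeds()
```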
## Author
@@ -183,6 +183,17 @@ class API:
        data.append( ("feeds", feed) )
        return data

    @request('reader/mark_story_hashes_as_read')
    def mark_story_hashes_as_read(self, story_hashes):
        '''
        Mark stories as read using their unique story_hash.
        '''

        data = []
        for hash in story_hashes:
            data.append( ("story_hash", hash) )
        return data

    @request('reader/mark_story_as_read')
    def mark_story_as_read(self, feed_id, story_ids):
        '''
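As an illustrative sketch of what the new method builds (the hashes are made up; the `@request` decorator, not shown here, is assumed to POST the returned pairs to the named endpoint), repeated `story_hash` form fields are emitted one per hash:

```
data = []
for story_hash in ["42:6f9a2b", "42:8c01de"]:   # hypothetical "<feed_id>:<guid>" hashes
    data.append(("story_hash", story_hash))
# data == [("story_hash", "42:6f9a2b"), ("story_hash", "42:8c01de")]
```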
@@ -3,6 +3,7 @@ import base64
import urlparse
import datetime
import lxml.html
from django import forms
from django.conf import settings
from django.http import HttpResponse
from django.shortcuts import render_to_response

@@ -13,10 +14,12 @@ from apps.reader.forms import SignupForm, LoginForm
from apps.profile.models import Profile
from apps.social.models import MSocialProfile, MSharedStory, MSocialSubscription
from apps.rss_feeds.models import Feed
from apps.rss_feeds.text_importer import TextImporter
from apps.reader.models import UserSubscription, UserSubscriptionFolders, RUserStory
from utils import json_functions as json
from utils import log as logging
from utils.feed_functions import relative_timesince
from utils.view_functions import required_params


@json.json_view

@@ -53,10 +56,13 @@ def signup(request):
    if form.errors:
        errors = form.errors
    if form.is_valid():
        new_user = form.save()
        login_user(request, new_user)
        logging.user(request, "~FG~SB~BBAPI NEW SIGNUP: ~FW%s / %s" % (new_user.email, ip))
        code = 1
        try:
            new_user = form.save()
            login_user(request, new_user)
            logging.user(request, "~FG~SB~BBAPI NEW SIGNUP: ~FW%s / %s" % (new_user.email, ip))
            code = 1
        except forms.ValidationError, e:
            errors = [e.args[0]]
    else:
        errors = dict(method="Invalid method. Use POST. You used %s" % request.method)

@@ -171,14 +177,18 @@ def check_share_on_site(request, token):
    except Profile.DoesNotExist:
        code = -1

    logging.user(request.user, "~FBFinding feed (check_share_on_site): %s" % rss_url)
    feed = Feed.get_feed_from_url(rss_url, create=False, fetch=False)
    if not feed:
        logging.user(request.user, "~FBFinding feed (check_share_on_site): %s" % story_url)
        feed = Feed.get_feed_from_url(story_url, create=False, fetch=False)
    if not feed:
        parsed_url = urlparse.urlparse(story_url)
        base_url = "%s://%s%s" % (parsed_url.scheme, parsed_url.hostname, parsed_url.path)
        logging.user(request.user, "~FBFinding feed (check_share_on_site): %s" % base_url)
        feed = Feed.get_feed_from_url(base_url, create=False, fetch=False)
    if not feed:
        logging.user(request.user, "~FBFinding feed (check_share_on_site): %s" % (base_url + '/'))
        feed = Feed.get_feed_from_url(base_url+'/', create=False, fetch=False)

    if feed and user:
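A small sketch of the base-URL fallback above, assuming Python 2's `urlparse` as imported in this file; the story URL is made up:

```
import urlparse

story_url = "http://example.com/posts/42?utm_source=x"   # hypothetical input
parsed_url = urlparse.urlparse(story_url)
base_url = "%s://%s%s" % (parsed_url.scheme, parsed_url.hostname, parsed_url.path)
# base_url == "http://example.com/posts/42" -- the query string is dropped
```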
@@ -232,40 +242,61 @@ def check_share_on_site(request, token):

    return response

@required_params('story_url', 'comments', 'title')
def share_story(request, token=None):
    code = 0
    story_url = request.POST['story_url']
    comments = request.POST['comments']
    title = request.POST['title']
    content = request.POST['content']
    rss_url = request.POST.get('rss_url')
    feed_id = request.POST.get('feed_id') or 0
    story_url = request.REQUEST['story_url']
    comments = request.REQUEST['comments']
    title = request.REQUEST['title']
    content = request.REQUEST.get('content', None)
    rss_url = request.REQUEST.get('rss_url', None)
    feed_id = request.REQUEST.get('feed_id', None) or 0
    feed = None
    message = None
    profile = None

    if not story_url:
        code = -1
    elif request.user.is_authenticated():
    if request.user.is_authenticated():
        profile = request.user.profile
    else:
        try:
            profile = Profile.objects.get(secret_token=token)
        except Profile.DoesNotExist:
            code = -1
            if token:
                message = "Not authenticated, couldn't find user by token."
            else:
                message = "Not authenticated, no token supplied and not authenticated."

    if not profile:
        return HttpResponse(json.encode({
            'code': code,
            'message': message,
            'story': None,
        }), mimetype='text/plain')

    if feed_id:
        feed = Feed.get_by_id(feed_id)
    else:
        if rss_url:
            logging.user(request.user, "~FBFinding feed (share_story): %s" % rss_url)
            feed = Feed.get_feed_from_url(rss_url, create=True, fetch=True)
        if not feed:
            logging.user(request.user, "~FBFinding feed (share_story): %s" % story_url)
            feed = Feed.get_feed_from_url(story_url, create=True, fetch=True)
        if feed:
            feed_id = feed.pk

    content = lxml.html.fromstring(content)
    content.make_links_absolute(story_url)
    content = lxml.html.tostring(content)
    if content:
        content = lxml.html.fromstring(content)
        content.make_links_absolute(story_url)
        content = lxml.html.tostring(content)
    else:
        importer = TextImporter(story=None, story_url=story_url, request=request, debug=settings.DEBUG)
        document = importer.fetch(skip_save=True, return_document=True)
        content = document['content']
        if not title:
            title = document['title']

    shared_story = MSharedStory.objects.filter(user_id=profile.user.pk,
                                               story_feed_id=feed_id,
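A minimal sketch of the link-absolutizing step above, assuming lxml is installed; the HTML fragment and URL are made up:

```
import lxml.html

content = '<p><a href="/about">About</a></p>'              # hypothetical shared HTML
content = lxml.html.fromstring(content)
content.make_links_absolute("http://example.com/posts/42") # the story_url
print lxml.html.tostring(content)
# <p><a href="http://example.com/about">About</a></p>
```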
@@ -289,6 +320,7 @@ def share_story(request, token=None):
            socialsub.needs_unread_recalc = True
            socialsub.save()
        logging.user(profile.user, "~BM~FYSharing story from site: ~SB%s: %s" % (story_url, comments))
        message = "Sharing story from site: %s: %s" % (story_url, comments)
    else:
        shared_story.story_content = content
        shared_story.story_title = title

@@ -299,7 +331,7 @@ def share_story(request, token=None):
        shared_story.story_feed_id = feed_id
        shared_story.save()
        logging.user(profile.user, "~BM~FY~SBUpdating~SN shared story from site: ~SB%s: %s" % (story_url, comments))

        message = "Updating shared story from site: %s: %s" % (story_url, comments)
    try:
        socialsub = MSocialSubscription.objects.get(user_id=profile.user.pk,
                                                    subscription_user_id=profile.user.pk)

@@ -319,7 +351,7 @@ def share_story(request, token=None):
    response = HttpResponse(json.encode({
        'code': code,
        'message': message,
        'story': None,
        'story': shared_story,
    }), mimetype='text/plain')
    response['Access-Control-Allow-Origin'] = '*'
    response['Access-Control-Allow-Methods'] = 'POST'
@@ -15,7 +15,6 @@ class MCategory(mongo.Document):
        'collection': 'category',
        'indexes': ['title'],
        'allow_inheritance': False,
        'index_drop_dups': True,
    }

    def __unicode__(self):

@@ -100,7 +99,6 @@ class MCategorySite(mongo.Document):
        'collection': 'category_site',
        'indexes': ['feed_id', 'category_title'],
        'allow_inheritance': False,
        'index_drop_dups': True,
    }

    def __unicode__(self):
@@ -16,7 +16,7 @@
        <outline text="very small array" description="" title="very small array" type="rss" version="RSS" htmlUrl="http://www.verysmallarray.com" xmlUrl="http://www.verysmallarray.com/?feed=rss2"/>
        <outline text="Frugal Traveler" description="" title="Frugal Traveler" type="rss" version="RSS" htmlUrl="http://frugaltraveler.blogs.nytimes.com/" xmlUrl="http://frugaltraveler.blogs.nytimes.com/feed/"/>
    </outline>
    <outline text="Tech" title="Tech">
    <outline text="tech" title="tech">
        <outline text="Joel on Software" description="" title="Joel on Software" type="rss" version="RSS" htmlUrl="http://www.joelonsoftware.com" xmlUrl="http://www.joelonsoftware.com/rss.xml"/>
        <outline text="Daring Fireball" description="" title="Daring Fireball" type="rss" version="RSS" htmlUrl="http://daringfireball.net/" xmlUrl="http://daringfireball.net/index.xml"/>
        <outline text="Techcrunch" description="" title="Techcrunch" type="rss" version="RSS" htmlUrl="http://techcrunch.com" xmlUrl="http://feeds.feedburner.com/Techcrunch"/>
@@ -1,4 +1,12 @@
[
    {
        "pk": 2,
        "model": "sites.site",
        "fields": {
            "domain": "testserver",
            "name": "testserver"
        }
    },
    {
        "pk": 1,
        "model": "auth.user",

@@ -16,5 +24,13 @@
            "email": "samuel@newsblur.com",
            "date_joined": "2009-01-04 17:32:58"
        }
    },
    {
        "pk": 1,
        "model": "reader.usersubscriptionfolders",
        "fields": {
            "folders": "[{\"Tech\": [4, 5]}, 1, 2, 3, 6]",
            "user": 1
        }
    }
]
@@ -12,7 +12,7 @@ class ImportTest(TestCase):

    def setUp(self):
        self.client = Client()

    def test_opml_import(self):
        self.client.login(username='conesus', password='test')
        user = User.objects.get(username='conesus')

@@ -20,22 +20,24 @@ class ImportTest(TestCase):
        # Verify user has no feeds
        subs = UserSubscription.objects.filter(user=user)
        self.assertEquals(subs.count(), 0)

        f = open(os.path.join(os.path.dirname(__file__), 'fixtures/opml.xml'))
        response = self.client.post(reverse('opml-upload'), {'file': f})
        self.assertEquals(response.status_code, 200)

        # Verify user now has feeds
        subs = UserSubscription.objects.filter(user=user)
        self.assertEquals(subs.count(), 54)

        usf = UserSubscriptionFolders.objects.get(user=user)
        print json.decode(usf.folders)
        self.assertEquals(json.decode(usf.folders), [{u'Tech': [4, 5, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28]}, 1, 2, 3, 6, {u'New York': [1, 2, 3, 4, 5, 6, 7, 8, 9]}, {u'tech': []}, {u'Blogs': [29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, {u'The Bloglets': [45, 46, 47, 48, 49]}]}, {u'Cooking': [50, 51, 52, 53]}, 54])

    def test_opml_import__empty(self):
        self.client.login(username='conesus', password='test')
        user = User.objects.get(username='conesus')

        # Verify user has no feeds
        # Verify user has default feeds
        subs = UserSubscription.objects.filter(user=user)
        self.assertEquals(subs.count(), 0)

@@ -60,4 +62,6 @@ class ImportTest(TestCase):
        self.assertEquals(subs.count(), 66)

        usf = UserSubscriptionFolders.objects.get(user=user)
        self.assertEquals(json.decode(usf.folders), [{'Blogs \xe2\x80\x94 The Bloglets': [6, 16, 22, 35, 51, 56]}, {'Blogs': [1, 3, 25, 29, 30, 39, 40, 41, 50, 55, 57, 58, 59, 60, 66]}, {'Cooking': [11, 15, 42, 43, 46]}, {'New York': [7, 8, 17, 18, 19, 36, 45, 47, 52, 61]}, {'Tech': [2, 4, 9, 10, 12, 13, 14, 20, 23, 24, 26, 27, 28, 31, 32, 33, 34, 48, 49, 62, 64]}, {'Blogs \xe2\x80\x94 Tumblrs': [5, 21, 37, 38, 53, 54, 63, 65]}, 44])
        # print json.decode(usf.folders)
        self.assertEquals(json.decode(usf.folders), [{u'Tech': [4, 5, 2, 9, 10, 12, 13, 14, 20, 23, 24, 26, 27, 28, 31, 32, 33, 34, 48, 49, 62, 64]}, 1, 2, 3, 6, {u'Blogs': [1, 3, 25, 29, 30, 39, 40, 41, 50, 55, 57, 58, 59, 60, 66]}, {u'Blogs \u2014 Tumblrs': [5, 21, 37, 38, 53, 54, 63, 65]}, {u'Blogs \u2014 The Bloglets': [6, 16, 22, 35, 51, 56]}, {u'New York': [7, 8, 17, 18, 19, 36, 45, 47, 52, 61]}, {u'Cooking': [11, 15, 42, 43, 46]}, 44])
apps/newsletters/__init__.py (new file): 0 lines

apps/newsletters/models.py (new file): 189 lines

@@ -0,0 +1,189 @@
import datetime
import re
import redis
from cgi import escape
from django.db import models
from django.contrib.auth.models import User
from django.contrib.sites.models import Site
from django.core.mail import EmailMultiAlternatives
from django.core.urlresolvers import reverse
from django.conf import settings
from django.template.loader import render_to_string
from django.utils.html import linebreaks
from apps.rss_feeds.models import Feed, MStory, MFetchHistory
from apps.reader.models import UserSubscription, UserSubscriptionFolders
from apps.profile.models import Profile, MSentEmail
from utils import log as logging
from utils.story_functions import linkify
from utils.scrubber import Scrubber

class EmailNewsletter:

    def receive_newsletter(self, params):
        user = self._user_from_email(params['recipient'])
        if not user:
            return

        sender_name, sender_username, sender_domain = self._split_sender(params['from'])
        feed_address = self._feed_address(user, "%s@%s" % (sender_username, sender_domain))

        usf = UserSubscriptionFolders.objects.get(user=user)
        usf.add_folder('', 'Newsletters')

        try:
            feed = Feed.objects.get(feed_address=feed_address)
        except Feed.DoesNotExist:
            feed = Feed.objects.create(feed_address=feed_address,
                                       feed_link='http://' + sender_domain,
                                       feed_title=sender_name,
                                       fetched_once=True,
                                       known_good=True)
            feed.update()
            logging.user(user, "~FCCreating newsletter feed: ~SB%s" % (feed))
            r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
            r.publish(user.username, 'reload:%s' % feed.pk)
            self._check_if_first_newsletter(user)

        if feed.feed_title != sender_name:
            feed.feed_title = sender_name
            feed.save()

        try:
            usersub = UserSubscription.objects.get(user=user, feed=feed)
        except UserSubscription.DoesNotExist:
            _, _, usersub = UserSubscription.add_subscription(
                user=user,
                feed_address=feed_address,
                folder='Newsletters'
            )
            r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
            r.publish(user.username, 'reload:feeds')

        story_hash = MStory.ensure_story_hash(params['signature'], feed.pk)
        story_content = self._get_content(params)
        plain_story_content = self._get_content(params, force_plain=True)
        if len(plain_story_content) > len(story_content):
            story_content = plain_story_content
        story_content = self._clean_content(story_content)
        story_params = {
            "story_feed_id": feed.pk,
            "story_date": datetime.datetime.fromtimestamp(int(params['timestamp'])),
            "story_title": params['subject'],
            "story_content": story_content,
            "story_author_name": params['from'],
            "story_permalink": "https://%s%s" % (
                                Site.objects.get_current().domain,
                                reverse('newsletter-story',
                                        kwargs={'story_hash': story_hash})),
            "story_guid": params['signature'],
        }
        print story_params
        try:
            story = MStory.objects.get(story_hash=story_hash)
        except MStory.DoesNotExist:
            story = MStory(**story_params)
            story.save()

        usersub.needs_unread_recalc = True
        usersub.save()

        self._publish_to_subscribers(feed)

        MFetchHistory.add(feed_id=feed.pk, fetch_type='push')
        logging.user(user, "~FCNewsletter feed story: ~SB%s~SN / ~SB%s" % (story.story_title, feed))

        return story

    def _check_if_first_newsletter(self, user, force=False):
        if not user.email:
            return

        subs = UserSubscription.objects.filter(user=user)
        found_newsletter = False
        for sub in subs:
            if sub.feed.is_newsletter:
                found_newsletter = True
                break
        if not found_newsletter and not force:
            return

        params = dict(receiver_user_id=user.pk, email_type='first_newsletter')
        try:
            sent_email = MSentEmail.objects.get(**params)
            if not force:
                # Return if email already sent
                return
        except MSentEmail.DoesNotExist:
            sent_email = MSentEmail.objects.create(**params)

        text = render_to_string('mail/email_first_newsletter.txt', {})
        html = render_to_string('mail/email_first_newsletter.xhtml', {})
        subject = "Your email newsletters are now being sent to NewsBlur"
        msg = EmailMultiAlternatives(subject, text,
                                     from_email='NewsBlur <%s>' % settings.HELLO_EMAIL,
                                     to=['%s <%s>' % (user, user.email)])
        msg.attach_alternative(html, "text/html")
        msg.send(fail_silently=True)

        logging.user(user, "~BB~FM~SBSending first newsletter email to: %s" % user.email)

    def _user_from_email(self, email):
        tokens = re.search('(\w+)[\+\-\.](\w+)@newsletters.newsblur.com', email)
        if not tokens:
            return

        username, secret_token = tokens.groups()
        try:
            profiles = Profile.objects.filter(secret_token=secret_token)
            if not profiles:
                return
            profile = profiles[0]
        except Profile.DoesNotExist:
            return

        return profile.user
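To illustrate the address format that regex accepts (the address and token here are made up):

```
import re

email = "samuel+abc123@newsletters.newsblur.com"   # hypothetical recipient
tokens = re.search(r'(\w+)[\+\-\.](\w+)@newsletters.newsblur.com', email)
username, secret_token = tokens.groups()
# username == "samuel", secret_token == "abc123"
```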
    def _feed_address(self, user, sender):
        return 'newsletter:%s:%s' % (user.pk, sender)

    def _split_sender(self, sender):
        tokens = re.search('(.*?) <(.*?)@(.*?)>', sender)

        if not tokens:
            name, domain = sender.split('@')
            return name, sender, domain

        sender_name, sender_username, sender_domain = tokens.group(1), tokens.group(2), tokens.group(3)
        sender_name = sender_name.replace('"', '')

        return sender_name, sender_username, sender_domain
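For example (a made-up sender), the regex splits a display-name address into its three parts and the quotes are stripped:

```
import re

sender = '"Daily Digest" <news@example.com>'       # hypothetical From: header
tokens = re.search('(.*?) <(.*?)@(.*?)>', sender)
name = tokens.group(1).replace('"', '')            # 'Daily Digest'
username = tokens.group(2)                         # 'news'
domain = tokens.group(3)                           # 'example.com'
```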
    def _get_content(self, params, force_plain=False):
        if 'body-enriched' in params and not force_plain:
            return params['body-enriched']
        if 'body-html' in params and not force_plain:
            return params['body-html']
        if 'stripped-html' in params and not force_plain:
            return params['stripped-html']
        if 'body-plain' in params:
            return linkify(linebreaks(params['body-plain']))

    def _clean_content(self, content):
        original = content
        scrubber = Scrubber()
        content = scrubber.scrub(content)
        if len(content) < len(original)*0.01:
            content = original
        content = content.replace('!important', '')
        return content

    def _publish_to_subscribers(self, feed):
        try:
            r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
            listeners_count = r.publish(str(feed.pk), 'story:new')
            if listeners_count:
                logging.debug("   ---> [%-30s] ~FMPublished to %s subscribers" % (feed.title[:30], listeners_count))
        except redis.ConnectionError:
            logging.debug("   ***> [%-30s] ~BMRedis is unavailable for real-time." % (feed.title[:30],))
apps/newsletters/tests.py (new file): 16 lines

@@ -0,0 +1,16 @@
"""
This file demonstrates writing tests using the unittest module. These will pass
when you run "manage.py test".

Replace this with more appropriate tests for your application.
"""

from django.test import TestCase


class SimpleTest(TestCase):
    def test_basic_addition(self):
        """
        Tests that 1 + 1 always equals 2.
        """
        self.assertEqual(1 + 1, 2)
apps/newsletters/urls.py (new file): 7 lines

@@ -0,0 +1,7 @@
from django.conf.urls import *
from apps.newsletters import views

urlpatterns = patterns('',
    url(r'^receive/?$', views.newsletter_receive, name='newsletter-receive'),
    url(r'^story/(?P<story_hash>[\w:]+)/?$', views.newsletter_story, name='newsletter-story'),
)
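A hypothetical sketch of resolving the `newsletter-story` route named above; the story hash is made up, and the URL prefix depends on how the project's root urls.py mounts this module:

```
from django.core.urlresolvers import reverse

reverse('newsletter-story', kwargs={'story_hash': '42:6f9a2b'})
# e.g. '/newsletters/story/42:6f9a2b' under an assumed 'newsletters/' prefix
```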
apps/newsletters/views.py (new file): 58 lines

apps/notifications/__init__.py (new file): 0 lines

apps/notifications/models.py (new file): 289 lines

@@ -0,0 +1,289 @@
import datetime
import enum
import redis
import mongoengine as mongo
import boto
from django.conf import settings
from django.contrib.auth.models import User
from django.template.loader import render_to_string
from django.core.mail import EmailMultiAlternatives
# from django.utils.html import strip_tags
from apps.rss_feeds.models import MStory, Feed
from apps.reader.models import UserSubscription
from apps.analyzer.models import MClassifierTitle, MClassifierAuthor, MClassifierFeed, MClassifierTag
from apps.analyzer.models import compute_story_score
from utils.story_functions import truncate_chars
from utils import log as logging
from utils import mongoengine_fields
from HTMLParser import HTMLParser
from vendor.apns import APNs, Payload
from BeautifulSoup import BeautifulSoup
import types

class NotificationFrequency(enum.Enum):
    immediately = 1
    hour_1 = 2
    hour_6 = 3
    hour_12 = 4
    hour_24 = 5

class MUserNotificationTokens(mongo.Document):
    '''A user's push notification tokens'''
    user_id = mongo.IntField()
    ios_tokens = mongo.ListField(mongo.StringField(max_length=1024))

    meta = {
        'collection': 'notification_tokens',
        'indexes': [{'fields': ['user_id'],
                     'unique': True,
                     'types': False, }],
        'allow_inheritance': False,
    }

    @classmethod
    def get_tokens_for_user(cls, user_id):
        try:
            tokens = cls.objects.get(user_id=user_id)
        except cls.DoesNotExist:
            tokens = cls.objects.create(user_id=user_id)

        return tokens

class MUserFeedNotification(mongo.Document):
    '''A user's notifications of a single feed.'''
    user_id = mongo.IntField()
    feed_id = mongo.IntField()
    frequency = mongoengine_fields.IntEnumField(NotificationFrequency)
    is_focus = mongo.BooleanField()
    last_notification_date = mongo.DateTimeField(default=datetime.datetime.now)
    is_email = mongo.BooleanField()
    is_web = mongo.BooleanField()
    is_ios = mongo.BooleanField()
    is_android = mongo.BooleanField()
    ios_tokens = mongo.ListField(mongo.StringField(max_length=1024))

    meta = {
        'collection': 'notifications',
        'indexes': ['feed_id',
                    {'fields': ['user_id', 'feed_id'],
                     'unique': True,
                     'types': False, }],
        'allow_inheritance': False,
    }

    def __unicode__(self):
        notification_types = []
        if self.is_email: notification_types.append('email')
        if self.is_web: notification_types.append('web')
        if self.is_ios: notification_types.append('ios')
        if self.is_android: notification_types.append('android')

        return "%s/%s: %s -> %s" % (
            User.objects.get(pk=self.user_id).username,
            Feed.get_feed_by_id(self.feed_id),
            ','.join(notification_types),
            self.last_notification_date,
        )

    @classmethod
    def feed_has_users(cls, feed_id):
        return cls.users_for_feed(feed_id).count()

    @classmethod
    def users_for_feed(cls, feed_id):
        notifications = cls.objects.filter(feed_id=feed_id)

        return notifications

    @classmethod
    def feeds_for_user(cls, user_id):
        notifications = cls.objects.filter(user_id=user_id)
        notifications_by_feed = {}

        for feed in notifications:
            notifications_by_feed[feed.feed_id] = {
                'notification_types': [],
                'notification_filter': "focus" if feed.is_focus else "unread",
            }
            if feed.is_email: notifications_by_feed[feed.feed_id]['notification_types'].append('email')
            if feed.is_web: notifications_by_feed[feed.feed_id]['notification_types'].append('web')
            if feed.is_ios: notifications_by_feed[feed.feed_id]['notification_types'].append('ios')
            if feed.is_android: notifications_by_feed[feed.feed_id]['notification_types'].append('android')

        return notifications_by_feed

    @classmethod
    def push_feed_notifications(cls, feed_id, new_stories, force=False):
        feed = Feed.get_by_id(feed_id)
        notifications = MUserFeedNotification.users_for_feed(feed.pk)
        logging.debug("   ---> [%-30s] ~FCPushing out notifications to ~SB%s users~SN for ~FB~SB%s stories" % (
                      feed, len(notifications), new_stories))
        r = redis.Redis(connection_pool=settings.REDIS_STORY_HASH_POOL)

        latest_story_hashes = r.zrange("zF:%s" % feed.pk, -1 * new_stories, -1)
        mstories = MStory.objects.filter(story_hash__in=latest_story_hashes).order_by('-story_date')
        stories = Feed.format_stories(mstories)
        total_sent_count = 0

        for user_feed_notification in notifications:
            sent_count = 0
            last_notification_date = user_feed_notification.last_notification_date
            try:
                usersub = UserSubscription.objects.get(user=user_feed_notification.user_id,
                                                       feed=user_feed_notification.feed_id)
            except UserSubscription.DoesNotExist:
                continue
            classifiers = user_feed_notification.classifiers(usersub)

            if classifiers == None:
                logging.debug("Has no usersubs")
                continue

            for story in stories:
                if sent_count >= 3:
                    logging.debug("Sent too many, ignoring...")
                    continue
                if story['story_date'] < last_notification_date and not force:
                    logging.debug("Story date older than last notification date: %s < %s" % (story['story_date'], last_notification_date))
                    continue

                if story['story_date'] > user_feed_notification.last_notification_date:
                    user_feed_notification.last_notification_date = story['story_date']
                    user_feed_notification.save()

                story['story_content'] = HTMLParser().unescape(story['story_content'])

                sent = user_feed_notification.push_story_notification(story, classifiers, usersub)
                if sent:
                    sent_count += 1
                    total_sent_count += 1
        return total_sent_count, len(notifications)
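The `zrange` call above leans on negative indexes: `zF:<feed_id>` is a sorted set scored by story date, so the tail holds the newest hashes. A sketch with a made-up key and count:

```
import redis

r = redis.Redis()                      # assumes a local redis for illustration
latest = r.zrange("zF:42", -3, -1)     # the 3 newest story hashes, oldest first
```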
    def classifiers(self, usersub):
        classifiers = {}
        if usersub.is_trained:
            classifiers['feeds'] = list(MClassifierFeed.objects(user_id=self.user_id, feed_id=self.feed_id,
                                                                social_user_id=0))
            classifiers['authors'] = list(MClassifierAuthor.objects(user_id=self.user_id, feed_id=self.feed_id))
            classifiers['titles'] = list(MClassifierTitle.objects(user_id=self.user_id, feed_id=self.feed_id))
            classifiers['tags'] = list(MClassifierTag.objects(user_id=self.user_id, feed_id=self.feed_id))

        return classifiers

    def title_and_body(self, story, usersub):
        def replace_with_newlines(element):
            text = ''
            for elem in element.recursiveChildGenerator():
                if isinstance(elem, types.StringTypes):
                    text += elem
                elif elem.name == 'br':
                    text += '\n'
                elif elem.name == 'p':
                    text += '\n\n'
            return text

        feed_title = usersub.user_title or usersub.feed.feed_title
        # title = "%s: %s" % (feed_title, story['story_title'])
        title = feed_title
        subtitle = story['story_title']
        # body = HTMLParser().unescape(strip_tags(story['story_content']))
        soup = BeautifulSoup(story['story_content'].strip())
        # print story['story_content']
        body = replace_with_newlines(soup)
        body = truncate_chars(body.strip(), 1600)

        return title, subtitle, body

    def push_story_notification(self, story, classifiers, usersub):
        story_score = self.story_score(story, classifiers)
        if self.is_focus and story_score <= 0:
            logging.debug("Is focus, but story is hidden")
            return False
        elif story_score < 0:
            logging.debug("Is unread, but story is hidden")
            return False

        user = User.objects.get(pk=self.user_id)
        logging.user(user, "~FCSending push notification: %s/%s (score: %s)" % (story['story_title'][:40], story['story_hash'], story_score))

        self.send_web(story, user)
        self.send_ios(story, user, usersub)
        self.send_android(story)
        self.send_email(story, usersub)

        return True

    def send_web(self, story, user):
        if not self.is_web: return

        r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
        r.publish(user.username, 'notification:%s,%s' % (story['story_hash'], story['story_title']))

    def send_ios(self, story, user, usersub):
        if not self.is_ios: return

        apns = APNs(use_sandbox=True,
                    cert_file='/srv/newsblur/config/certificates/aps_development.pem',
                    key_file='/srv/newsblur/config/certificates/aps_development.pem')

        tokens = MUserNotificationTokens.get_tokens_for_user(self.user_id)
        title, subtitle, body = self.title_and_body(story, usersub)
        image_url = None
        if len(story['image_urls']):
            image_url = story['image_urls'][0]
            # print image_url

        for token in tokens.ios_tokens:
            logging.user(user, '~BMStory notification by iOS: ~FY~SB%s~SN~BM~FY/~SB%s' %
                         (story['story_title'][:50], usersub.feed.feed_title[:50]))
            payload = Payload(alert={'title': title,
                                     'subtitle': subtitle,
                                     'body': body},
                              category="STORY_CATEGORY",
                              mutable_content=True,
                              custom={'story_hash': story['story_hash'],
                                      'story_feed_id': story['story_feed_id'],
                                      'image_url': image_url,
                                      })
            apns.gateway_server.send_notification(token, payload)

    def send_android(self, story):
        if not self.is_android: return

    def send_email(self, story, usersub):
        if not self.is_email: return
        feed = usersub.feed

        params = {
            "story": story,
            "feed": feed,
            "feed_title": usersub.user_title or feed.feed_title,
            "favicon_border": feed.favicon_color,
        }
        from_address = 'share@newsblur.com'
        to_address = '%s <%s>' % (usersub.user.username, usersub.user.email)
        text = render_to_string('mail/email_story_notification.txt', params)
        html = render_to_string('mail/email_story_notification.xhtml', params)
        subject = '%s: %s' % (usersub.user_title or usersub.feed.feed_title, story['story_title'])
        subject = subject.replace('\n', ' ')
        msg = EmailMultiAlternatives(subject, text,
                                     from_email='NewsBlur <%s>' % from_address,
                                     to=[to_address])
        msg.attach_alternative(html, "text/html")
        try:
            msg.send()
        except boto.ses.connection.ResponseError, e:
            logging.user(usersub.user, '~BMStory notification by email error: ~FR%s' % e)
        logging.user(usersub.user, '~BMStory notification by email: ~FY~SB%s~SN~BM~FY/~SB%s' %
                     (story['story_title'][:50], usersub.feed.feed_title[:50]))

    def story_score(self, story, classifiers):
        score = compute_story_score(story, classifier_titles=classifiers.get('titles', []),
                                    classifier_authors=classifiers.get('authors', []),
                                    classifier_tags=classifiers.get('tags', []),
                                    classifier_feeds=classifiers.get('feeds', []))

        return score
apps/notifications/tasks.py (new file): 10 lines

@@ -0,0 +1,10 @@
from celery.task import Task
from django.contrib.auth.models import User
from apps.notifications.models import MUserFeedNotification
from utils import log as logging


class QueueNotifications(Task):

    def run(self, feed_id, new_stories):
        MUserFeedNotification.push_feed_notifications(feed_id, new_stories)
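A hypothetical invocation, mirroring the `.delay(...)` style used for `EmailNewPremium` elsewhere in this diff; the feed id and story count are made up:

```
from apps.notifications.tasks import QueueNotifications

QueueNotifications.delay(feed_id=42, new_stories=3)   # queued to a task server
```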
apps/notifications/tests.py (new file): 16 lines

@@ -0,0 +1,16 @@
"""
This file demonstrates writing tests using the unittest module. These will pass
when you run "manage.py test".

Replace this with more appropriate tests for your application.
"""

from django.test import TestCase


class SimpleTest(TestCase):
    def test_basic_addition(self):
        """
        Tests that 1 + 1 always equals 2.
        """
        self.assertEqual(1 + 1, 2)
apps/notifications/urls.py (new file): 11 lines

@@ -0,0 +1,11 @@
from django.conf.urls import url, patterns
from apps.notifications import views
from oauth2_provider import views as op_views

urlpatterns = patterns('',
    url(r'^$', views.notifications_by_feed, name='notifications-by-feed'),
    url(r'^feed/?$', views.set_notifications_for_feed, name='set-notifications-for-feed'),
    url(r'^apns_token/?$', views.set_apns_token, name='set-apns-token'),
    url(r'^android_token/?$', views.set_android_token, name='set-android-token'),
    url(r'^force_push/?$', views.force_push, name='force-push-notification'),
)
apps/notifications/views.py (new file): 100 lines

@@ -0,0 +1,100 @@
import redis
from django.conf import settings
from django.contrib.admin.views.decorators import staff_member_required
from utils import json_functions as json
from utils.user_functions import get_user, ajax_login_required
from apps.notifications.models import MUserFeedNotification, MUserNotificationTokens
from apps.rss_feeds.models import Feed
from utils.view_functions import required_params
from utils import log as logging


@ajax_login_required
@json.json_view
def notifications_by_feed(request):
    user = get_user(request)
    notifications_by_feed = MUserFeedNotification.feeds_for_user(user.pk)

    return notifications_by_feed

@ajax_login_required
@json.json_view
def set_notifications_for_feed(request):
    user = get_user(request)
    feed_id = request.POST['feed_id']
    notification_types = request.POST.getlist('notification_types')
    notification_filter = request.POST.get('notification_filter')

    try:
        notification = MUserFeedNotification.objects.get(user_id=user.pk, feed_id=feed_id)
    except MUserFeedNotification.DoesNotExist:
        params = {
            "user_id": user.pk,
            "feed_id": feed_id,
        }
        notification = MUserFeedNotification.objects.create(**params)

    web_was_off = not notification.is_web
    notification.is_focus = bool(notification_filter == "focus")
    notification.is_email = bool('email' in notification_types)
    notification.is_ios = bool('ios' in notification_types)
    notification.is_android = bool('android' in notification_types)
    notification.is_web = bool('web' in notification_types)
    notification.save()

    if (not notification.is_email and
        not notification.is_ios and
        not notification.is_android and
        not notification.is_web):
        notification.delete()

    r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
    if web_was_off and notification.is_web:
        r.publish(user.username, 'notification:setup:%s' % feed_id)

    notifications_by_feed = MUserFeedNotification.feeds_for_user(user.pk)

    return {"notifications_by_feed": notifications_by_feed}
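A sketch of what a client POST to that endpoint might look like, assuming the `requests` library and that the module is mounted at `/notifications/`; the host, cookie, and feed id are placeholders. Note `notification_types` is sent as a repeated form field, matching the `getlist()` above:

```
import requests

cookies = {"newsblur_sessionid": "..."}            # placeholder; login required
requests.post("https://www.newsblur.com/notifications/feed", data={
    "feed_id": 42,
    "notification_types": ["ios", "email"],        # encoded as repeated fields
    "notification_filter": "focus",
}, cookies=cookies)
```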
@ajax_login_required
@json.json_view
def set_apns_token(request):
    user = get_user(request)
    tokens = MUserNotificationTokens.get_tokens_for_user(user.pk)
    apns_token = request.REQUEST['apns_token']

    logging.user(user, "~FCUpdating APNS push token")
    if apns_token not in tokens.ios_tokens:
        tokens.ios_tokens.append(apns_token)
        tokens.save()
        return {'message': 'Token saved.'}

    return {'message': 'Token already saved.'}

@ajax_login_required
@json.json_view
def set_android_token(request):
    user = get_user(request)
    tokens = MUserNotificationTokens.get_tokens_for_user(user.pk)
    token = request.REQUEST['token']

    logging.user(user, "~FCUpdating Android push token")
    if token not in tokens.android_tokens:
        tokens.android_tokens.append(token)
        tokens.save()
        return {'message': 'Token saved.'}

    return {'message': 'Token already saved.'}

@required_params(feed_id=int)
@staff_member_required
@json.json_view
def force_push(request):
    user = get_user(request)
    feed_id = request.REQUEST['feed_id']
    count = int(request.REQUEST.get('count', 1))

    logging.user(user, "~BM~FWForce pushing %s stories: ~SB%s" % (count, Feed.get_by_id(feed_id)))
    sent_count, user_count = MUserFeedNotification.push_feed_notifications(feed_id, new_stories=count, force=True)

    return {"message": "Pushed %s notifications to %s users" % (sent_count, user_count)}
@@ -2,6 +2,7 @@ import urllib
import urlparse
import datetime
import lxml.html
import tweepy
from django.contrib.auth.decorators import login_required
from django.core.urlresolvers import reverse
from django.contrib.auth.models import User

@@ -23,7 +24,6 @@ from utils.view_functions import render_to
from utils import urlnorm
from utils import json_functions as json
from vendor import facebook
from vendor import tweepy
from vendor import appdotnet

@login_required

@@ -41,12 +41,13 @@ def twitter_connect(request):
    elif oauth_token and oauth_verifier:
        try:
            auth = tweepy.OAuthHandler(twitter_consumer_key, twitter_consumer_secret)
            auth.set_request_token(oauth_token, oauth_verifier)
            access_token = auth.get_access_token(oauth_verifier)
            auth.request_token = request.session['twitter_request_token']
            # auth.set_request_token(oauth_token, oauth_verifier)
            auth.get_access_token(oauth_verifier)
            api = tweepy.API(auth)
            twitter_user = api.me()
        except (tweepy.TweepError, IOError):
            logging.user(request, "~BB~FRFailed Twitter connect")
        except (tweepy.TweepError, IOError), e:
            logging.user(request, "~BB~FRFailed Twitter connect: %s" % e)
            return dict(error="Twitter has returned an error. Try connecting again.")

        # Be sure that two people aren't using the same Twitter account.

@@ -63,8 +64,8 @@ def twitter_connect(request):

        social_services = MSocialServices.get_user(request.user.pk)
        social_services.twitter_uid = unicode(twitter_user.id)
        social_services.twitter_access_key = access_token.key
        social_services.twitter_access_secret = access_token.secret
        social_services.twitter_access_key = auth.access_token
        social_services.twitter_access_secret = auth.access_token_secret
        social_services.syncing_twitter = True
        social_services.save()

@@ -76,7 +77,8 @@ def twitter_connect(request):
        # Start the OAuth process
        auth = tweepy.OAuthHandler(twitter_consumer_key, twitter_consumer_secret)
        auth_url = auth.get_authorization_url()
        logging.user(request, "~BB~FRStarting Twitter connect")
        request.session['twitter_request_token'] = auth.request_token
        logging.user(request, "~BB~FRStarting Twitter connect: %s" % auth.request_token)
        return {'next': auth_url}
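Condensed, the flow these hunks move to looks roughly like the sketch below, using tweepy's OAuthHandler API; `request.session` and `oauth_verifier` come from the Django view context, and the consumer keys are placeholders:

```
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")

# Step 1: send the user to Twitter, stashing the request token in the session.
auth_url = auth.get_authorization_url()
request.session['twitter_request_token'] = auth.request_token

# Step 2, on callback: restore the token, then trade the verifier for keys.
auth.request_token = request.session['twitter_request_token']
auth.get_access_token(oauth_verifier)
access_key, access_secret = auth.access_token, auth.access_token_secret
```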
@@ -631,6 +633,7 @@ def api_share_new_story(request):
    story_author = fields.get('story_author', "")
    comments = fields.get('comments', None)

    logging.user(request.user, "~FBFinding feed (api_share_new_story): %s" % story_url)
    original_feed = Feed.get_feed_from_url(story_url, create=True, fetch=True)
    story_hash = MStory.guid_hash_unsaved(story_url)
    if not user.profile.is_premium and MSharedStory.feed_quota(user.pk, original_feed and original_feed.pk or 0, story_hash):

@@ -664,7 +667,7 @@ def api_share_new_story(request):
        "story_title": story_title and story_title[:title_max] or "[Untitled]",
        "story_feed_id": original_feed and original_feed.pk or 0,
        "story_content": story_content,
        "story_author": story_author,
        "story_author_name": story_author,
        "story_date": datetime.datetime.now(),
        "user_id": user.pk,
        "comments": comments,

@@ -716,6 +719,7 @@ def api_save_new_story(request):
    user_tags = fields.get('user_tags', "")
    story = None

    logging.user(request.user, "~FBFinding feed (api_save_new_story): %s" % story_url)
    original_feed = Feed.get_feed_from_url(story_url)
    if not story_content or not story_title:
        ti = TextImporter(feed=original_feed, story_url=story_url, request=request)
@@ -162,7 +162,7 @@ class Profile(models.Model):

    def activate_premium(self, never_expire=False):
        from apps.profile.tasks import EmailNewPremium

        EmailNewPremium.delay(user_id=self.user.pk)

        self.is_premium = True

@@ -357,9 +357,10 @@ class Profile(models.Model):
        stripe_cancel = self.cancel_premium_stripe()
        return paypal_cancel or stripe_cancel

    def cancel_premium_paypal(self):
    def cancel_premium_paypal(self, second_most_recent_only=False):
        transactions = PayPalIPN.objects.filter(custom=self.user.username,
                                                txn_type='subscr_signup')
                                                txn_type='subscr_signup').order_by('-subscr_date')

        if not transactions:
            return

@@ -371,14 +372,24 @@ class Profile(models.Model):
            'API_CA_CERTS': False,
        }
        paypal = PayPalInterface(**paypal_opts)
        transaction = transactions[0]
        if second_most_recent_only:
            # Check if user has an active subscription. If so, cancel it because a new one came in.
            if len(transactions) > 1:
                transaction = transactions[1]
            else:
                return False
        else:
            transaction = transactions[0]
        profileid = transaction.subscr_id
        try:
            paypal.manage_recurring_payments_profile_status(profileid=profileid, action='Cancel')
        except PayPalAPIResponseError:
            logging.user(self.user, "~FRUser ~SBalready~SN canceled Paypal subscription")
            logging.user(self.user, "~FRUser ~SBalready~SN canceled Paypal subscription: %s" % profileid)
        else:
            logging.user(self.user, "~FRCanceling Paypal subscription")
            if second_most_recent_only:
                logging.user(self.user, "~FRCanceling ~BR~FWsecond-oldest~SB~FR Paypal subscription: %s" % profileid)
            else:
                logging.user(self.user, "~FRCanceling Paypal subscription: %s" % profileid)

        return True
@@ -773,6 +784,36 @@ NewsBlur""" % {'user': self.user.username, 'feeds': subs.count()}

        logging.user(self.user, "~BB~FM~SBSending launch social email for user: %s months, %s" % (months_ago, self.user.email))

    def send_launch_turntouch_email(self, force=False):
        if not self.user.email or not self.send_emails:
            logging.user(self.user, "~FM~SB~FRNot~FM sending launch TT email for user, %s: %s" % (self.user.email and 'opt-out: ' or 'blank', self.user.email))
            return

        params = dict(receiver_user_id=self.user.pk, email_type='launch_turntouch')
        try:
            sent_email = MSentEmail.objects.get(**params)
            if not force:
                # Return if email already sent
                logging.user(self.user, "~FM~SB~FRNot~FM sending launch social email for user, sent already: %s" % self.user.email)
                return
        except MSentEmail.DoesNotExist:
            sent_email = MSentEmail.objects.create(**params)

        delta = datetime.datetime.now() - self.last_seen_on
        months_ago = delta.days / 30
        user = self.user
        data = dict(user=user, months_ago=months_ago)
        text = render_to_string('mail/email_launch_turntouch.txt', data)
        html = render_to_string('mail/email_launch_turntouch.xhtml', data)
        subject = "Introducing Turn Touch for NewsBlur"
        msg = EmailMultiAlternatives(subject, text,
                                     from_email='NewsBlur <%s>' % settings.HELLO_EMAIL,
                                     to=['%s <%s>' % (user, user.email)])
        msg.attach_alternative(html, "text/html")
        msg.send(fail_silently=True)

        logging.user(self.user, "~BB~FM~SBSending launch TT email for user: %s months, %s" % (months_ago, self.user.email))

    def grace_period_email_sent(self, force=False):
        emails_sent = MSentEmail.objects.filter(receiver_user_id=self.user.pk,
                                                email_type='premium_expire_grace')
@@ -883,6 +924,8 @@ def paypal_signup(sender, **kwargs):
    except:
        pass
    user.profile.activate_premium()
    user.profile.cancel_premium_stripe()
    user.profile.cancel_premium_paypal(second_most_recent_only=True)
subscription_signup.connect(paypal_signup)

def paypal_payment_history_sync(sender, **kwargs):

@@ -931,6 +974,7 @@ def stripe_signup(sender, full_json, **kwargs):
        profile = Profile.objects.get(stripe_id=stripe_id)
        logging.user(profile.user, "~BC~SB~FBStripe subscription signup")
        profile.activate_premium()
        profile.cancel_premium_paypal()
    except Profile.DoesNotExist:
        return {"code": -1, "message": "User doesn't exist."}
zebra_webhook_customer_subscription_created.connect(stripe_signup)
@@ -977,7 +1021,39 @@ def blank_authenticate(username, password=""):
    encoded_username = authenticate(username=username, password=username)
    if encoded_blank == hash or encoded_username == user:
        return user


# Unfinished
class MEmailUnsubscribe(mongo.Document):
    user_id = mongo.IntField()
    email_type = mongo.StringField()
    date = mongo.DateTimeField(default=datetime.datetime.now)

    EMAIL_TYPE_FOLLOWS = 'follows'
    EMAIL_TYPE_REPLIES = 'replies'
    EMAIL_TYPE_PRODUCT = 'product'

    meta = {
        'collection': 'email_unsubscribes',
        'allow_inheritance': False,
        'indexes': ['user_id',
                    {'fields': ['user_id', 'email_type'],
                     'unique': True,
                     'types': False}],
    }

    def __unicode__(self):
        return "%s unsubscribed from %s on %s" % (self.user_id, self.email_type, self.date)

    @classmethod
    def user(cls, user_id):
        unsubs = cls.objects(user_id=user_id)
        return unsubs

    @classmethod
    def unsubscribe(cls, user_id, email_type):
        cls.objects.create()


class MSentEmail(mongo.Document):
    sending_user_id = mongo.IntField()
    receiver_user_id = mongo.IntField()
@@ -1019,7 +1095,7 @@ class PaymentHistory(models.Model):
    }

    @classmethod
    def report(cls, months=25):
    def report(cls, months=26):
        def _counter(start_date, end_date):
            payments = PaymentHistory.objects.filter(payment_date__gte=start_date, payment_date__lte=end_date)
            payments = payments.aggregate(avg=Avg('payment_amount'),

@@ -1028,7 +1104,7 @@ class PaymentHistory(models.Model):
            print "%s-%02d-%02d - %s-%02d-%02d:\t$%.2f\t$%-6s\t%-4s" % (
                start_date.year, start_date.month, start_date.day,
                end_date.year, end_date.month, end_date.day,
                round(payments['avg'], 2), payments['sum'], payments['count'])
                round(payments['avg'] if payments['avg'] else 0, 2), payments['sum'] if payments['sum'] else 0, payments['count'])
            return payments['sum']

        print "\nMonthly Totals:"
@@ -1041,14 +1117,44 @@ class PaymentHistory(models.Model):
            total = _counter(start_date, end_date)
            month_totals[start_date.strftime("%Y-%m")] = total

        print "\nCurrent Month Totals:"
        month_totals = {}
        years = datetime.datetime.now().year - 2009
        for y in reversed(range(years)):
            now = datetime.datetime.now()
            start_date = datetime.datetime(now.year, now.month, 1) - dateutil.relativedelta.relativedelta(years=y)
            end_time = start_date + datetime.timedelta(days=31)
            end_date = datetime.datetime(end_time.year, end_time.month, 1) - datetime.timedelta(seconds=1)
            if end_date > now: end_date = now
            month_totals[start_date.strftime("%Y-%m")] = _counter(start_date, end_date)

        print "\nMTD Totals:"
        month_totals = {}
        years = datetime.datetime.now().year - 2009
        for y in reversed(range(years)):
            now = datetime.datetime.now()
            start_date = datetime.datetime(now.year, now.month, 1) - dateutil.relativedelta.relativedelta(years=y)
            end_date = now - dateutil.relativedelta.relativedelta(years=y)
            if end_date > now: end_date = now
            month_totals[start_date.strftime("%Y-%m")] = _counter(start_date, end_date)

        print "\nYearly Totals:"
        year_totals = {}
        years = datetime.datetime.now().year - 2009
        for y in reversed(range(years)):
            now = datetime.datetime.now()
            start_date = datetime.datetime(now.year, 1, 1) - dateutil.relativedelta.relativedelta(years=y)
            end_time = start_date + datetime.timedelta(days=365)
            end_date = datetime.datetime(end_time.year, end_time.month, 30) - datetime.timedelta(seconds=1)
            end_date = datetime.datetime(now.year, 1, 1) - dateutil.relativedelta.relativedelta(years=y-1) - datetime.timedelta(seconds=1)
            if end_date > now: end_date = now
            year_totals[now.year - y] = _counter(start_date, end_date)

        print "\nYTD Totals:"
        year_totals = {}
        years = datetime.datetime.now().year - 2009
        for y in reversed(range(years)):
            now = datetime.datetime.now()
            start_date = datetime.datetime(now.year, 1, 1) - dateutil.relativedelta.relativedelta(years=y)
            end_date = now - dateutil.relativedelta.relativedelta(years=y)
            if end_date > now: end_date = now
            year_totals[now.year - y] = _counter(start_date, end_date)
|
|
|
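The month and year windows in report() all follow one pattern: the first instant of the period, then the first instant of the next period minus one second. A self-contained check of that trick (the date is an arbitrary stand-in):

import datetime
from dateutil.relativedelta import relativedelta

now = datetime.datetime(2016, 3, 14, 9, 26)
start_date = datetime.datetime(now.year, now.month, 1)
# First instant of next month, minus one second = last instant of this month
end_date = start_date + relativedelta(months=1) - datetime.timedelta(seconds=1)
assert end_date == datetime.datetime(2016, 3, 31, 23, 59, 59)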
@@ -3,7 +3,7 @@ from celery.task import Task
from apps.profile.models import Profile, RNewUserQueue
from utils import log as logging
from apps.reader.models import UserSubscription
from apps.social.models import MSocialServices
from apps.social.models import MSocialServices, MActivity, MInteraction

class EmailNewUser(Task):


@@ -60,6 +60,8 @@ class CleanupUser(Task):
        UserSubscription.trim_user_read_stories(user_id)
        UserSubscription.verify_feeds_scheduled(user_id)
        Profile.count_all_feed_subscribers_for_user(user_id)
        MInteraction.trim(user_id)
        MActivity.trim(user_id)
        # UserSubscription.refresh_stale_feeds(user_id)

        try:

@@ -34,7 +34,7 @@ from vendor.paypal.standard.forms import PayPalPaymentsForm

SINGLE_FIELD_PREFS = ('timezone','feed_pane_size','hide_mobile','send_emails',
                      'hide_getting_started', 'has_setup_feeds', 'has_found_friends',
                      'has_trained_intelligence',)
                      'has_trained_intelligence')
SPECIAL_PREFERENCES = ('old_password', 'new_password', 'autofollow_friends', 'dashboard_date',)

@ajax_login_required

@@ -419,10 +419,14 @@ def payment_history(request):
        "read_story_count": RUserStory.read_story_count(user.pk),
        "feed_opens": UserSubscription.objects.filter(user=user).aggregate(sum=Sum('feed_opens'))['sum'],
        "training": {
            'title': MClassifierTitle.objects.filter(user_id=user.pk).count(),
            'tag': MClassifierTag.objects.filter(user_id=user.pk).count(),
            'author': MClassifierAuthor.objects.filter(user_id=user.pk).count(),
            'feed': MClassifierFeed.objects.filter(user_id=user.pk).count(),
            'title_ps': MClassifierTitle.objects.filter(user_id=user.pk, score__gt=0).count(),
            'title_ng': MClassifierTitle.objects.filter(user_id=user.pk, score__lt=0).count(),
            'tag_ps': MClassifierTag.objects.filter(user_id=user.pk, score__gt=0).count(),
            'tag_ng': MClassifierTag.objects.filter(user_id=user.pk, score__lt=0).count(),
            'author_ps': MClassifierAuthor.objects.filter(user_id=user.pk, score__gt=0).count(),
            'author_ng': MClassifierAuthor.objects.filter(user_id=user.pk, score__lt=0).count(),
            'feed_ps': MClassifierFeed.objects.filter(user_id=user.pk, score__gt=0).count(),
            'feed_ng': MClassifierFeed.objects.filter(user_id=user.pk, score__lt=0).count(),
        }
    }

@@ -68,7 +68,7 @@ class PushSubscriptionManager(models.Manager):
        elif response and response.status_code == 202:   # async verification
            subscription.verified = False
        else:
            error = response and response.content or ""
            error = response and response.text or ""
            if not force_retry and 'You may only subscribe to' in error:
                extracted_topic = re.search("You may only subscribe to (.*?) ", error)
                if extracted_topic:

@@ -76,7 +76,7 @@ class PushSubscriptionManager(models.Manager):
                        feed=feed, hub=hub, force_retry=True)
                else:
                    logging.debug(u'   ---> [%-30s] ~FR~BKFeed failed to subscribe to push: %s (code: %s)' % (
                        unicode(subscription.feed)[:30], error, response and response.status_code))
                        unicode(subscription.feed)[:30], error[:100], response and response.status_code))

        subscription.save()
        feed.setup_push()

@@ -141,7 +141,10 @@ class PushSubscription(models.Model):
                hub_url = link['href']
            elif link['rel'] == 'self':
                self_url = link['href']


        if hub_url and hub_url.startswith('//'):
            hub_url = "http:%s" % hub_url

        needs_update = False
        if hub_url and self.hub != hub_url:
            # hub URL has changed; let's update our subscription

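The hub error handling above re-subscribes to whatever topic the hub names in its error body. A standalone look at that extraction (the error text is a hypothetical example):

import re

error = "You may only subscribe to http://example.com/feed.xml "
match = re.search("You may only subscribe to (.*?) ", error)
if match:
    # Non-greedy match stops at the first space after the topic URL
    extracted_topic = match.group(1)  # 'http://example.com/feed.xml'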
@@ -21,6 +21,14 @@
            "user": 1
        }
    },
    {
        "pk": 2,
        "model": "reader.usersubscriptionfolders",
        "fields": {
            "folders": "[5299728, 644144, 1187026, {\"Brainiacs & Opinion\": [569, 38, 3581, 183139, 1186180, 15]}, {\"Science & Technology\": [731503, 140145, 1272495, 76, 161, 39, {\"Hacker\": [5985150, 3323431]}]}, {\"Humor\": [212379, 3530, 5994357]}, {\"Videos\": [3240, 5168]}]",
            "user": 2
        }
    },

    {
        "pk": 2,

@@ -161,6 +169,24 @@
            "email": "samuel@newsblur.com",
            "date_joined": "2009-01-04 17:32:58"
        }
    },
    {
        "pk": 2,
        "model": "auth.user",
        "fields": {
            "username": "Dejal",
            "first_name": "",
            "last_name": "",
            "is_active": 1,
            "is_superuser": 1,
            "is_staff": 1,
            "last_login": "2009-04-07 19:22:24",
            "groups": [],
            "user_permissions": [],
            "password": "sha1$7b94b$ac9e6cf08d0fa16a67e56e319c0935aeb26db2a2",
            "email": "dejal@newsblur.com",
            "date_joined": "2009-01-04 17:32:58"
        }
    },
    {
        "pk": 1206,

@@ -10,6 +10,7 @@ from apps.profile.tasks import EmailNewUser
from apps.social.models import MActivity
from apps.profile.models import blank_authenticate, RNewUserQueue
from utils import log as logging
from dns.resolver import query, NXDOMAIN, NoNameservers

class LoginForm(forms.Form):
    username = forms.CharField(label=_("Username or Email"), max_length=30,

@@ -102,19 +103,26 @@ class SignupForm(forms.Form):
        return self.cleaned_data['password']

    def clean_email(self):
        return self.cleaned_data['email']

    def clean(self):
        username = self.cleaned_data.get('username', '')
        password = self.cleaned_data.get('password', '')
        email = self.cleaned_data.get('email', None)
        if email:
            email_exists = User.objects.filter(email__iexact=email).count()
            if email_exists:
                raise forms.ValidationError(_(u'Someone is already using that email address.'))
            if any([banned in email for banned in ['mailwire24', 'mailbox9', 'scintillamail', 'bluemailboxes', 'devmailing']]):
                logging.info(" ***> [%s] Spammer signup banned: %s/%s" % (username, password, email))
                logging.info(" ***> [%s] Spammer signup banned: %s/%s" % (self.cleaned_data.get('username', None), self.cleaned_data.get('password', None), email))
                raise forms.ValidationError('Seriously, fuck off spammer.')
            try:
                domain = email.rsplit('@', 1)[-1]
                if not query(domain, 'MX'):
                    raise forms.ValidationError('Sorry, that email is invalid.')
            except (NXDOMAIN, NoNameservers):
                raise forms.ValidationError('Sorry, that email is invalid.')
        return self.cleaned_data['email']

    def clean(self):
        username = self.cleaned_data.get('username', '')
        password = self.cleaned_data.get('password', '')
        email = self.cleaned_data.get('email', None)
        exists = User.objects.filter(username__iexact=username).count()
        if exists:
            user_auth = authenticate(username=username, password=password)

@@ -125,12 +133,7 @@ class SignupForm(forms.Form):
    def save(self, profile_callback=None):
        username = self.cleaned_data['username']
        password = self.cleaned_data['password']

        email = self.cleaned_data.get('email', None)
        if email:
            email_exists = User.objects.filter(email__iexact=email).count()
            if email_exists:
                raise forms.ValidationError(_(u'Someone is already using that email address.'))
        email = self.cleaned_data['email']

        exists = User.objects.filter(username__iexact=username).count()
        if exists:

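The new clean_email() validates deliverability by requiring an MX record on the address's domain, via dnspython's resolver (imported above). The same check in isolation, as a sketch:

from dns.resolver import query, NXDOMAIN, NoNameservers

def has_mx_record(email):
    # Everything after the last '@' is treated as the mail domain
    domain = email.rsplit('@', 1)[-1]
    try:
        return bool(query(domain, 'MX'))
    except (NXDOMAIN, NoNameservers):
        return False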
@@ -339,7 +339,7 @@ class UserSubscription(models.Model):
        logging.user(user, "~FRAdding URL: ~SB%s (in %s) %s" % (feed_address, folder,
                     "~FCAUTO-ADD" if not auto_active else ""))

        feed = Feed.get_feed_from_url(feed_address)
        feed = Feed.get_feed_from_url(feed_address, user=user)

        if not feed:
            code = -1

@@ -383,6 +383,7 @@ class UserSubscription(models.Model):
            MActivity.new_feed_subscription(user_id=user.pk, feed_id=feed.pk, feed_title=feed.title)

        feed.setup_feed_for_premium_subscribers()
        feed.count_subscribers()

        return code, message, us


@@ -602,9 +603,14 @@ class UserSubscription(models.Model):
            cutoff_date = datetime.datetime.utcnow()
        recount = False

        self.last_read_date = cutoff_date
        self.mark_read_date = cutoff_date
        self.oldest_unread_story_date = cutoff_date
        if cutoff_date > self.mark_read_date or cutoff_date > self.oldest_unread_story_date:
            self.last_read_date = cutoff_date
            self.mark_read_date = cutoff_date
            self.oldest_unread_story_date = cutoff_date
        else:
            logging.user(self.user, "Not marking %s as read: %s > %s/%s" %
                         (self, cutoff_date, self.mark_read_date, self.oldest_unread_story_date))

        if not recount:
            self.unread_count_negative = 0
            self.unread_count_positive = 0

@@ -1288,13 +1294,17 @@ class UserSubscriptionFolders(models.Model):

        return _arrange_folder(user_sub_folders)

    def flatten_folders(self, feeds=None):
    def flatten_folders(self, feeds=None, inactive_feeds=None):
        folders = json.decode(self.folders)
        flat_folders = {" ": []}
        if feeds and not inactive_feeds:
            inactive_feeds = []

        def _flatten_folders(items, parent_folder="", depth=0):
            for item in items:
                if isinstance(item, int) and ((not feeds) or (feeds and item in feeds)):
                if (isinstance(item, int) and
                    (not feeds or
                     (item in feeds or item in inactive_feeds))):
                    if not parent_folder:
                        parent_folder = ' '
                    if parent_folder in flat_folders:

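For orientation, flatten_folders() walks the nested folder JSON — ints are feed ids, dicts are subfolders — into a flat {folder path: [feed ids]} map. A simplified, self-contained model of that walk (not NewsBlur's exact code; the ' - ' path join is an assumption):

def flatten(items, parent=" ", flat=None):
    # items is a nested list mixing feed ids (ints) with {name: [children]} dicts
    if flat is None:
        flat = {parent: []}
    for item in items:
        if isinstance(item, int):
            flat.setdefault(parent, []).append(item)
        elif isinstance(item, dict):
            for name, children in item.items():
                path = name if parent == " " else "%s - %s" % (parent, name)
                flatten(children, path, flat)
    return flat

folders = [1, 2, {"Tech": [3, {"Deep Tech": [4]}]}]
# flatten(folders) -> {' ': [1, 2], 'Tech': [3], 'Tech - Deep Tech': [4]}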
@@ -1317,17 +1327,18 @@ class UserSubscriptionFolders(models.Model):
        return flat_folders

    def delete_feed(self, feed_id, in_folder, commit_delete=True):
        feed_id = int(feed_id)
        def _find_feed_in_folders(old_folders, folder_name='', multiples_found=False, deleted=False):
            new_folders = []
            for k, folder in enumerate(old_folders):
                if isinstance(folder, int):
                    if (folder == feed_id and in_folder is not None and (
                        (folder_name != in_folder) or
                        (folder_name == in_folder and deleted))):
                        (in_folder not in folder_name) or
                        (in_folder in folder_name and deleted))):
                        multiples_found = True
                        logging.user(self.user, "~FB~SBDeleting feed, and a multiple has been found in '%s'" % (folder_name))
                        logging.user(self.user, "~FB~SBDeleting feed, and a multiple has been found in '%s' / '%s' %s" % (folder_name, in_folder, '(deleted)' if deleted else ''))
                    if (folder == feed_id and
                        (folder_name == in_folder or in_folder is None) and
                        (in_folder is None or in_folder in folder_name) and
                        not deleted):
                        logging.user(self.user, "~FBDelete feed: %s'th item: %s folders/feeds" % (
                            k, len(old_folders)

@@ -1371,7 +1382,7 @@ class UserSubscriptionFolders(models.Model):
                        feeds_to_delete.remove(folder)
                elif isinstance(folder, dict):
                    for f_k, f_v in folder.items():
                        if f_k == folder_to_delete and (folder_name == in_folder or in_folder is None):
                        if f_k == folder_to_delete and (in_folder in folder_name or in_folder is None):
                            logging.user(self.user, "~FBDeleting folder '~SB%s~SN' in '%s': %s" % (f_k, folder_name, folder))
                            deleted_folder = folder
                        else:

@@ -1407,7 +1418,7 @@ class UserSubscriptionFolders(models.Model):
                elif isinstance(folder, dict):
                    for f_k, f_v in folder.items():
                        nf = _find_folder_in_folders(f_v, f_k)
                        if f_k == folder_to_rename and folder_name == in_folder:
                        if f_k == folder_to_rename and in_folder in folder_name:
                            logging.user(self.user, "~FBRenaming folder '~SB%s~SN' in '%s' to: ~SB%s" % (
                                f_k, folder_name, new_folder_name))
                            f_k = new_folder_name

@@ -1462,6 +1473,7 @@ class UserSubscriptionFolders(models.Model):
        logging.user(self.user, "~FBMoving ~SB%s~SN feeds to folder: ~SB%s" % (
                     len(feeds_by_folder), to_folder))
        for feed_id, in_folder in feeds_by_folder:
            feed_id = int(feed_id)
            self.move_feed_to_folder(feed_id, in_folder, to_folder)

        return self

@@ -41,6 +41,6 @@ class CleanAnalytics(Task):
            settings.MONGOANALYTICSDB.nbanalytics.feed_fetches.count(),
        ))
        day_ago = datetime.datetime.utcnow() - datetime.timedelta(days=1)
        settings.MONGOANALYTICSDB.nbanalytics.feed_fetches.remove({
        settings.MONGOANALYTICSDB.nbanalytics.feed_fetches.delete_many({
            "date": {"$lt": day_ago},
        })

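The remove() to delete_many() change tracks PyMongo's API: remove() was deprecated in PyMongo 3 and later dropped, with delete_many() as the replacement for multi-document deletes. A sketch against a local MongoDB (the connection details are hypothetical stand-ins for settings.MONGOANALYTICSDB):

import datetime
from pymongo import MongoClient

db = MongoClient('mongodb://localhost:27017/')['nbanalytics']
day_ago = datetime.datetime.utcnow() - datetime.timedelta(days=1)
# delete_many returns a DeleteResult instead of the old remove() dict
result = db.feed_fetches.delete_many({"date": {"$lt": day_ago}})
print(result.deleted_count)  # number of fetch records dropped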
@@ -96,6 +96,22 @@ class ReaderTest(TestCase):
        response = self.client.get(reverse('load-feeds'))
        feeds = json.decode(response.content)
        self.assertEquals(feeds['folders'], [2, 3, 8, 9, {'Tech': [1, 4, 5, {'Deep Tech': [6, 7]}]}, {'Blogs': [8, 9]}])

    def test_move_feeds_by_folder(self):
        self.client.login(username='Dejal', password='test')

        response = self.client.get(reverse('load-feeds'))
        feeds = json.decode(response.content)
        self.assertEquals(feeds['folders'], [5299728, 644144, 1187026, {"Brainiacs & Opinion": [569, 38, 3581, 183139, 1186180, 15]}, {"Science & Technology": [731503, 140145, 1272495, 76, 161, 39, {"Hacker": [5985150, 3323431]}]}, {"Humor": [212379, 3530, 5994357]}, {"Videos": [3240, 5168]}])

        # Move feeds by folder
        response = self.client.post(reverse('move-feeds-by-folder-to-folder'), {u'feeds_by_folder': u'[\n  [\n    "5994357",\n    "Humor"\n  ],\n  [\n    "3530",\n    "Humor"\n  ]\n]', u'to_folder': u'Brainiacs & Opinion'})
        response = json.decode(response.content)
        self.assertEquals(response['code'], 1)

        response = self.client.get(reverse('load-feeds'))
        feeds = json.decode(response.content)
        self.assertEquals(feeds['folders'], [5299728, 644144, 1187026, {"Brainiacs & Opinion": [569, 38, 3581, 183139, 1186180, 15, 5994357, 3530]}, {"Science & Technology": [731503, 140145, 1272495, 76, 161, 39, {"Hacker": [5985150, 3323431]}]}, {"Humor": [212379]}, {"Videos": [3240, 5168]}])

    def test_load_single_feed(self):
        # from django.conf import settings

@@ -5,6 +5,7 @@ import redis
import requests
import random
import zlib
import re
from django.shortcuts import get_object_or_404
from django.shortcuts import render
from django.contrib.auth.decorators import login_required

@@ -35,6 +36,7 @@ from apps.profile.models import Profile
from apps.reader.models import UserSubscription, UserSubscriptionFolders, RUserStory, Feature
from apps.reader.forms import SignupForm, LoginForm, FeatureForm
from apps.rss_feeds.models import MFeedIcon, MStarredStoryCounts
from apps.notifications.models import MUserFeedNotification
from apps.search.models import MUserSearch
from apps.statistics.models import MStatistics
# from apps.search.models import SearchStarredStory

@@ -58,7 +60,7 @@ from utils.view_functions import get_argument_or_404, render_to, is_true
from utils.view_functions import required_params
from utils.ratelimit import ratelimit
from vendor.timezones.utilities import localtime_for_timezone

import tweepy

BANNED_URLS = [
    "brentozar.com",

@@ -221,7 +223,7 @@ def autologin(request, username, secret):
    else:
        return HttpResponseRedirect(reverse('index'))

@ratelimit(minutes=1, requests=24)
@ratelimit(minutes=1, requests=60)
@never_cache
@json.json_view
def load_feeds(request):

@@ -248,6 +250,7 @@ def load_feeds(request):
    folders = UserSubscriptionFolders.objects.get(user=user)

    user_subs = UserSubscription.objects.select_related('feed').filter(user=user)
    notifications = MUserFeedNotification.feeds_for_user(user.pk)

    day_ago = datetime.datetime.now() - datetime.timedelta(days=1)
    scheduled_feeds = []

@@ -258,6 +261,8 @@ def load_feeds(request):
        feeds[pk] = sub.canonical(include_favicon=include_favicons)

        if not sub.active: continue
        if pk in notifications:
            feeds[pk].update(notifications[pk])
        if not sub.feed.active and not sub.feed.has_feed_exception:
            scheduled_feeds.append(sub.feed.pk)
        elif sub.feed.active_subscribers <= 0:

@@ -297,6 +302,7 @@ def load_feeds(request):
        'social_services': social_services,
        'user_profile': user.profile,
        "is_staff": user.is_staff,
        'user_id': user.pk,
        'folders': json.decode(folders.folders),
        'starred_count': starred_count,
        'starred_counts': starred_counts,

@@ -321,8 +327,10 @@ def load_feeds_flat(request):
    user = request.user
    include_favicons = is_true(request.REQUEST.get('include_favicons', False))
    update_counts = is_true(request.REQUEST.get('update_counts', True))
    include_inactive = is_true(request.REQUEST.get('include_inactive', False))

    feeds = {}
    inactive_feeds = {}
    day_ago = datetime.datetime.now() - datetime.timedelta(days=1)
    scheduled_feeds = []
    iphone_version = "2.1" # Preserved forever. Don't change.

@@ -341,20 +349,31 @@ def load_feeds_flat(request):
        folders = []

    user_subs = UserSubscription.objects.select_related('feed').filter(user=user, active=True)
    notifications = MUserFeedNotification.feeds_for_user(user.pk)
    if not user_subs and folders:
        folders.auto_activate()
        user_subs = UserSubscription.objects.select_related('feed').filter(user=user, active=True)

    if include_inactive:
        inactive_subs = UserSubscription.objects.select_related('feed').filter(user=user, active=False)

    for sub in user_subs:
        pk = sub.feed_id
        if update_counts and sub.needs_unread_recalc:
            sub.calculate_feed_scores(silent=True)
        feeds[sub.feed_id] = sub.canonical(include_favicon=include_favicons)
        feeds[pk] = sub.canonical(include_favicon=include_favicons)
        if not sub.feed.active and not sub.feed.has_feed_exception:
            scheduled_feeds.append(sub.feed.pk)
        elif sub.feed.active_subscribers <= 0:
            scheduled_feeds.append(sub.feed.pk)
        elif sub.feed.next_scheduled_update < day_ago:
            scheduled_feeds.append(sub.feed.pk)
        if pk in notifications:
            feeds[pk].update(notifications[pk])


    if include_inactive:
        for sub in inactive_subs:
            inactive_feeds[sub.feed_id] = sub.canonical(include_favicon=include_favicons)

    if len(scheduled_feeds) > 0 and request.user.is_authenticated():
        logging.user(request, "~SN~FMTasking the scheduling immediate fetch of ~SB%s~SN feeds..." %

@@ -362,8 +381,11 @@ def load_feeds_flat(request):
        ScheduleImmediateFetches.apply_async(kwargs=dict(feed_ids=scheduled_feeds, user_id=user.pk))

    flat_folders = []
    flat_folders_with_inactive = []
    if folders:
        flat_folders = folders.flatten_folders(feeds=feeds)
        flat_folders_with_inactive = folders.flatten_folders(feeds=feeds,
                                                             inactive_feeds=inactive_feeds)

    social_params = {
        'user_id': user.pk,

@@ -381,16 +403,19 @@ def load_feeds_flat(request):
    if not user_subs:
        categories = MCategory.serialize()

    logging.user(request, "~FB~SBLoading ~FY%s~FB/~FM%s~FB feeds/socials ~FMflat~FB%s" % (
        len(feeds.keys()), len(social_feeds), '. ~FCUpdating counts.' if update_counts else ''))
    logging.user(request, "~FB~SBLoading ~FY%s~FB/~FM%s~FB/~FR%s~FB feeds/socials/inactive ~FMflat~FB%s" % (
        len(feeds.keys()), len(social_feeds), len(inactive_feeds), '. ~FCUpdating counts.' if update_counts else ''))

    data = {
        "flat_folders": flat_folders,
        "feeds": feeds,
        "flat_folders_with_inactive": flat_folders_with_inactive,
        "feeds": feeds if not include_inactive else {"0": "Don't include `include_inactive=true` if you want active feeds."},
        "inactive_feeds": inactive_feeds if include_inactive else {"0": "Include `include_inactive=true`"},
        "social_feeds": social_feeds,
        "social_profile": social_profile,
        "social_services": social_services,
        "user": user.username,
        "user_id": user.pk,
        "is_staff": user.is_staff,
        "user_profile": user.profile,
        "iphone_version": iphone_version,

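include_inactive, like the other flags here, arrives as a query-string value and goes through is_true() rather than bool(), since bool('false') is True in Python. A plausible minimal reading of that helper (an assumption for illustration; the real one lives in utils.view_functions):

def is_true(value):
    # Treat the usual query-string spellings of truth as True
    if value is True:
        return True
    return str(value).lower() in ('true', '1', 'yes', 'on')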
@@ -539,9 +564,10 @@ def load_single_feed(request, feed_id):
    offset = limit * (page-1)
    order = request.REQUEST.get('order', 'newest')
    read_filter = request.REQUEST.get('read_filter', 'all')
    query = request.REQUEST.get('query')
    query = request.REQUEST.get('query', '').strip()
    include_story_content = is_true(request.REQUEST.get('include_story_content', True))
    include_hidden = is_true(request.REQUEST.get('include_hidden', False))
    include_feeds = is_true(request.REQUEST.get('include_feeds', False))
    message = None
    user_search = None


@@ -560,6 +586,10 @@ def load_single_feed(request, feed_id):
    except UserSubscription.DoesNotExist:
        usersub = None

    if feed.is_newsletter and not usersub:
        # User must be subscribed to a newsletter in order to read it
        raise Http404

    if query:
        if user.profile.is_premium:
            user_search = MUserSearch.get_user(user.pk)

@@ -620,12 +650,14 @@ def load_single_feed(request, feed_id):
        starred_stories = MStarredStory.objects(user_id=user.pk,
                                                story_feed_id=feed.pk,
                                                story_hash__in=story_hashes)\
                                       .hint([('user_id', 1), ('story_hash', 1)])\
                                       .only('story_hash', 'starred_date', 'user_tags')
        shared_story_hashes = MSharedStory.check_shared_story_hashes(user.pk, story_hashes)
        shared_stories = []
        if shared_story_hashes:
            shared_stories = MSharedStory.objects(user_id=user.pk,
                                                  story_hash__in=shared_story_hashes)\
                                         .hint([('story_hash', 1)])\
                                         .only('story_hash', 'shared_date', 'comments')
        starred_stories = dict([(story.story_hash, dict(starred_date=story.starred_date,
                                                        user_tags=story.user_tags))

@@ -678,6 +710,10 @@ def load_single_feed(request, feed_id):
    feed_tags = json.decode(feed.data.popular_tags) if feed.data.popular_tags else []
    feed_authors = json.decode(feed.data.popular_authors) if feed.data.popular_authors else []

    if include_feeds:
        feeds = Feed.objects.filter(pk__in=set([story['story_feed_id'] for story in stories]))
        feeds = [feed.canonical(include_favicon=False) for feed in feeds]

    if usersub:
        usersub.feed_opens += 1
        usersub.needs_unread_recalc = True

@@ -708,7 +744,7 @@ def load_single_feed(request, feed_id):
            hidden_stories_removed += 1
        stories = new_stories

    data = dict(stories=stories,
    data = dict(stories=stories,
                user_profiles=user_profiles,
                feed_tags=feed_tags,
                feed_authors=feed_authors,

@@ -719,6 +755,7 @@ def load_single_feed(request, feed_id):
                elapsed_time=round(float(timediff), 2),
                message=message)

    if include_feeds: data['feeds'] = feeds
    if not include_hidden: data['hidden_stories_removed'] = hidden_stories_removed
    if dupe_feed_id: data['dupe_feed_id'] = dupe_feed_id
    if not usersub:

@@ -728,7 +765,8 @@ def load_single_feed(request, feed_id):

    # if page <= 3:
    #     import random
    #     time.sleep(random.randint(2, 4))
    #     time.sleep(random.randint(2, 7) / 10.0)
    #     # time.sleep(random.randint(2, 14))

    # if page == 2:
    #     assert False

@@ -791,7 +829,7 @@ def load_starred_stories(request):
    offset = int(request.REQUEST.get('offset', 0))
    limit = int(request.REQUEST.get('limit', 10))
    page = int(request.REQUEST.get('page', 0))
    query = request.REQUEST.get('query')
    query = request.REQUEST.get('query', '').strip()
    order = request.REQUEST.get('order', 'newest')
    tag = request.REQUEST.get('tag')
    story_hashes = request.REQUEST.getlist('h')[:100]

@@ -846,6 +884,7 @@ def load_starred_stories(request):
    if shared_story_hashes:
        shared_stories = MSharedStory.objects(user_id=user.pk,
                                              story_hash__in=shared_story_hashes)\
                                     .hint([('story_hash', 1)])\
                                     .only('story_hash', 'shared_date', 'comments')
    shared_stories = dict([(story.story_hash, dict(shared_date=story.shared_date,
                                                   comments=story.comments))

@@ -1095,7 +1134,7 @@ def load_read_stories(request):
    limit = int(request.REQUEST.get('limit', 10))
    page = int(request.REQUEST.get('page', 0))
    order = request.REQUEST.get('order', 'newest')
    query = request.REQUEST.get('query')
    query = request.REQUEST.get('query', '').strip()
    now = localtime_for_timezone(datetime.datetime.now(), user.profile.timezone)
    message = None
    if page: offset = limit * (page - 1)

@@ -1127,12 +1166,14 @@ def load_read_stories(request):

    shared_stories = MSharedStory.objects(user_id=user.pk,
                                          story_hash__in=story_hashes)\
                                 .hint([('story_hash', 1)])\
                                 .only('story_hash', 'shared_date', 'comments')
    shared_stories = dict([(story.story_hash, dict(shared_date=story.shared_date,
                                                   comments=story.comments))
                           for story in shared_stories])
    starred_stories = MStarredStory.objects(user_id=user.pk,
                                            story_hash__in=story_hashes)\
                                   .hint([('user_id', 1), ('story_hash', 1)])\
                                   .only('story_hash', 'starred_date')
    starred_stories = dict([(story.story_hash, story.starred_date)
                            for story in starred_stories])

@@ -1183,8 +1224,9 @@ def load_river_stories__redis(request):
    page = int(request.REQUEST.get('page', 1))
    order = request.REQUEST.get('order', 'newest')
    read_filter = request.REQUEST.get('read_filter', 'unread')
    query = request.REQUEST.get('query')
    query = request.REQUEST.get('query', '').strip()
    include_hidden = is_true(request.REQUEST.get('include_hidden', False))
    include_feeds = is_true(request.REQUEST.get('include_feeds', False))
    now = localtime_for_timezone(datetime.datetime.now(), user.profile.timezone)
    usersubs = []
    code = 1

@@ -1261,9 +1303,10 @@ def load_river_stories__redis(request):
        if read_filter == 'starred':
            starred_stories = mstories
        else:
            story_hashes = [s['story_hash'] for s in stories]
            starred_stories = MStarredStory.objects(
                user_id=user.pk,
                story_feed_id__in=found_feed_ids
                story_hash__in=story_hashes
            ).only('story_hash', 'starred_date')
        starred_stories = dict([(story.story_hash, dict(starred_date=story.starred_date,
                                                        user_tags=story.user_tags))

@@ -1321,9 +1364,12 @@ def load_river_stories__redis(request):
            'title': apply_classifier_titles(classifier_titles, story),
        }
        story['score'] = UserSubscription.score_story(story['intelligence'])


    if not user.profile.is_premium:
    if include_feeds:
        feeds = Feed.objects.filter(pk__in=set([story['story_feed_id'] for story in stories]))
        feeds = [feed.canonical(include_favicon=False) for feed in feeds]

    if not user.profile.is_premium and not include_feeds:
        message = "The full River of News is a premium feature."
        code = 0
        # if page > 1:

@@ -1359,7 +1405,8 @@ def load_river_stories__redis(request):
        elapsed_time=timediff,
        user_search=user_search,
        user_profiles=user_profiles)


    if include_feeds: data['feeds'] = feeds
    if not include_hidden: data['hidden_stories_removed'] = hidden_stories_removed

    return data

@@ -1701,6 +1748,10 @@ def mark_feed_as_read(request):
    errors = []
    cutoff_date = datetime.datetime.fromtimestamp(cutoff_timestamp) if cutoff_timestamp else None

    if cutoff_date:
        logging.user(request, "~FMMark %s feeds read, %s - cutoff: %s/%s" %
                     (len(feed_ids), direction, cutoff_timestamp, cutoff_date))

    for feed_id in feed_ids:
        if 'social:' in feed_id:
            user_id = int(feed_id.replace('social:', ''))

@@ -1776,20 +1827,37 @@ def add_url(request):
    elif any([(banned_url in url) for banned_url in BANNED_URLS]):
        code = -1
        message = "The publisher of this website has banned NewsBlur."
    else:
        if new_folder:
            usf, _ = UserSubscriptionFolders.objects.get_or_create(user=request.user)
            usf.add_folder(folder, new_folder)
            folder = new_folder
    elif re.match('(https?://)?twitter.com/\w+/?$', url):
        if not request.user.profile.is_premium:
            message = "You must be a premium subscriber to add Twitter feeds."
            code = -1
        else:
            # Check if Twitter API is active for user
            ss = MSocialServices.get_user(request.user.pk)
            try:
                if not ss.twitter_uid:
                    raise tweepy.TweepError("No API token")
                ss.twitter_api().me()
            except tweepy.TweepError, e:
                code = -1
                message = "Your Twitter connection isn't setup. Go to Manage - Friends and reconnect Twitter."

    if code == -1:
        return dict(code=code, message=message)

    code, message, us = UserSubscription.add_subscription(user=request.user, feed_address=url,
                                                          folder=folder, auto_active=auto_active,
                                                          skip_fetch=skip_fetch)
    feed = us and us.feed
    if feed:
        r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
        r.publish(request.user.username, 'reload:%s' % feed.pk)
        MUserSearch.schedule_index_feeds_for_search(feed.pk, request.user.pk)
    if new_folder:
        usf, _ = UserSubscriptionFolders.objects.get_or_create(user=request.user)
        usf.add_folder(folder, new_folder)
        folder = new_folder

    code, message, us = UserSubscription.add_subscription(user=request.user, feed_address=url,
                                                          folder=folder, auto_active=auto_active,
                                                          skip_fetch=skip_fetch)
    feed = us and us.feed
    if feed:
        r = redis.Redis(connection_pool=settings.REDIS_PUBSUB_POOL)
        r.publish(request.user.username, 'reload:%s' % feed.pk)
        MUserSearch.schedule_index_feeds_for_search(feed.pk, request.user.pk)

    return dict(code=code, message=message, feed=feed)

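add_url and get_feed_from_url share the same Twitter profile-URL pattern. What it matches, checked standalone:

import re

pattern = r'(https?://)?twitter.com/\w+/?$'
assert re.match(pattern, 'https://twitter.com/newsblur')
assert re.match(pattern, 'twitter.com/newsblur/')
# A single tweet is not a profile page, so it should not match:
assert not re.match(pattern, 'https://twitter.com/newsblur/status/1')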
@@ -1845,6 +1913,7 @@ def delete_feed_by_url(request):
    if in_folder == ' ':
        in_folder = ""

    logging.user(request.user, "~FBFinding feed (delete_feed_by_url): %s" % url)
    feed = Feed.get_feed_from_url(url, create=False)
    if feed:
        user_sub_folders = get_object_or_404(UserSubscriptionFolders, user=request.user)

@@ -1920,6 +1989,7 @@ def rename_folder(request):
    folder_to_rename = request.POST.get('folder_name') or request.POST.get('folder_to_rename')
    new_folder_name = request.POST['new_folder_name']
    in_folder = request.POST.get('in_folder', '')
    if 'Top Level' in in_folder: in_folder = ''
    code = 0

    # Works piss poor with duplicate folder titles, if they are both in the same folder.

@@ -2081,14 +2151,17 @@ def feeds_trainer(request):
def save_feed_chooser(request):
    is_premium = request.user.profile.is_premium
    approved_feeds = [int(feed_id) for feed_id in request.POST.getlist('approved_feeds') if feed_id]
    approve_all = False
    if not is_premium:
        approved_feeds = approved_feeds[:64]
    elif is_premium and not approved_feeds:
        approve_all = True
    activated = 0
    usersubs = UserSubscription.objects.filter(user=request.user)

    for sub in usersubs:
        try:
            if sub.feed_id in approved_feeds:
            if sub.feed_id in approved_feeds or approve_all:
                activated += 1
                if not sub.active:
                    sub.active = True

@@ -2200,7 +2273,8 @@ def _mark_story_as_starred(request):
    removed_user_tags = []
    if not starred_story:
        params.update(story_values)
        if params.has_key('story_latest_content_z'):
            params.pop('story_latest_content_z')
        starred_story = MStarredStory.objects.create(**params)
        created = True
        MActivity.new_starred_story(user_id=request.user.pk,

@@ -10,6 +10,7 @@ import operator
import gzip
import datetime
import requests
import httplib
from PIL import BmpImagePlugin, PngImagePlugin, Image
from socket import error as SocketError
from boto.s3.key import Key

@@ -62,11 +63,17 @@ class IconImporter(object):
            image = None
        if (image and
            (self.force or
             self.feed_icon.color != color or
             self.feed_icon.data != image_str or
             self.feed_icon.icon_url != icon_url or
             self.feed_icon.not_found or
             (settings.BACKED_BY_AWS.get('icons_on_s3') and not self.feed.s3_icon))):
            logging.debug("   ---> [%-30s] ~SN~FBIcon difference:~FY color:%s (%s/%s) data:%s url:%s notfound:%s no-s3:%s" % (
                self.feed,
                self.feed_icon.color != color, self.feed_icon.color, color,
                self.feed_icon.data != image_str,
                self.feed_icon.icon_url != icon_url,
                self.feed_icon.not_found,
                settings.BACKED_BY_AWS.get('icons_on_s3') and not self.feed.s3_icon))
            self.feed_icon.data = image_str
            self.feed_icon.icon_url = icon_url
            self.feed_icon.color = color

@@ -81,8 +88,10 @@ class IconImporter(object):

        if not image:
            self.feed_icon.not_found = True
            self.feed_icon.save()
            self.feed.favicon_not_found = True

        self.feed.save()

        return not self.feed.favicon_not_found

    def save_to_s3(self, image_str):

@@ -96,6 +105,7 @@ class IconImporter(object):
        k.set_acl('public-read')

        self.feed.s3_icon = True
        self.feed.save()

    def load_icon(self, image_file, index=None):
        '''

@@ -198,7 +208,7 @@ class IconImporter(object):
            url = self._url_from_html(content)
            if not url:
                try:
                    content = requests.get(self.feed.feed_link).content
                    content = requests.get(self.cleaned_feed_link).content
                    url = self._url_from_html(content)
                except (AttributeError, SocketError, requests.ConnectionError,
                        requests.models.MissingSchema, requests.sessions.InvalidSchema,

@@ -206,12 +216,19 @@ class IconImporter(object):
                        requests.models.InvalidURL,
                        requests.models.ChunkedEncodingError,
                        requests.models.ContentDecodingError,
                        httplib.IncompleteRead,
                        LocationParseError, OpenSSLError, PyAsn1Error), e:
                    logging.debug(" ---> ~SN~FRFailed~FY to fetch ~FGfeed icon~FY: %s" % e)
        if url:
            image, image_file = self.get_image_from_url(url)
        return image, image_file, url


    @property
    def cleaned_feed_link(self):
        if self.feed.feed_link.startswith('http'):
            return self.feed.feed_link
        return 'http://' + self.feed.feed_link

    def fetch_image_from_path(self, path='favicon.ico', force=False):
        image = None
        url = None

@@ -312,8 +329,9 @@ class IconImporter(object):
        # Reshape array of values to merge color bands. [[R], [G], [B], [A]] => [R, G, B, A]
        if len(shape) > 2:
            ar = ar.reshape(scipy.product(shape[:2]), shape[2])


        # Get NUM_CLUSTERS worth of centroids.
        ar = ar.astype(numpy.float)
        codes, _ = scipy.cluster.vq.kmeans(ar, NUM_CLUSTERS)

        # Pare centroids, removing blacks and whites and shades of really dark and really light.

@@ -341,7 +359,7 @@ class IconImporter(object):

        # Find the most frequent color, based on the counts.
        index_max = scipy.argmax(counts)
        peak = codes[index_max]
        peak = codes.astype(int)[index_max]
        color = ''.join(chr(c) for c in peak).encode('hex')

        return color[:6]

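The favicon color picker clusters pixel values with SciPy k-means and takes the most popular centroid; the hunk above only hardens the final integer cast. A compact restatement on synthetic pixels (modern NumPy/SciPy spellings, not the Python 2 code above):

import numpy as np
from scipy.cluster.vq import kmeans, vq

pixels = np.random.rand(100, 3) * 255          # stand-in for icon RGB data
codes, _ = kmeans(pixels.astype(float), 5)     # 5 candidate centroids
vecs, _ = vq(pixels, codes)                    # nearest centroid per pixel
counts = np.bincount(vecs)                     # centroid popularity
peak = codes[np.argmax(counts)].astype(int)    # dominant color as ints
color = '%02x%02x%02x' % tuple(peak)           # hex string, e.g. '7fa2c3'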
22
apps/rss_feeds/management/commands/query_popularity.py
Normal file

@@ -0,0 +1,22 @@
from django.core.management.base import BaseCommand
from apps.reader.models import UserSubscription
from django.conf import settings
from optparse import make_option
from django.contrib.auth.models import User
from apps.rss_feeds.models import Feed
import os
import errno
import re
import datetime

class Command(BaseCommand):
    option_list = BaseCommand.option_list + (
        make_option("-q", "--query", dest="query", help="Search query"),
        make_option("-l", "--limit", dest="limit", type="int", default=1000, help="Limit of stories"),
    )

    def handle(self, *args, **options):
        # settings.LOG_TO_STREAM = True

        # Feed.query_popularity(options['query'], limit=options['limit'])
        Feed.xls_query_popularity(options['query'], limit=options['limit'])
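As a standard Django management command, this runs as `python manage.py query_popularity -q "some search" -l 500`; the command name comes from the module's file name, and -q/-l map to the options declared above.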
@@ -20,6 +20,7 @@ from django.db import models
from django.db import IntegrityError
from django.conf import settings
from django.db.models.query import QuerySet
from django.db.utils import DatabaseError
from django.core.urlresolvers import reverse
from django.contrib.auth.models import User
from django.contrib.sites.models import Site

@@ -33,7 +34,7 @@ from apps.rss_feeds.text_importer import TextImporter
from apps.search.models import SearchStory, SearchFeed
from apps.statistics.rstats import RStats
from utils import json_functions as json
from utils import feedfinder, feedparser
from utils import feedfinder2 as feedfinder
from utils import urlnorm
from utils import log as logging
from utils.fields import AutoOneToOneField

@@ -142,7 +143,17 @@ class Feed(models.Model):
            return datetime.datetime.utcnow() - datetime.timedelta(days=settings.DAYS_OF_UNREAD)

        return datetime.datetime.utcnow() - datetime.timedelta(days=settings.DAYS_OF_UNREAD_FREE)


    @classmethod
    def generate_hash_address_and_link(cls, feed_address, feed_link):
        if not feed_address: feed_address = ""
        if not feed_link: feed_link = ""
        return hashlib.sha1(feed_address+feed_link).hexdigest()

    @property
    def is_newsletter(self):
        return self.feed_address.startswith('newsletter:')

    def canonical(self, full=False, include_favicon=True):
        feed = {
            'id': self.pk,

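generate_hash_address_and_link() is a plain SHA-1 fingerprint of the concatenated address and link, used below to detect duplicate feeds on save collisions. The Python 2 code hashes str directly; the Python 3 equivalent needs an explicit encode (the values are stand-ins):

import hashlib

feed_address = "https://example.com/rss"
feed_link = "https://example.com/"
digest = hashlib.sha1((feed_address + feed_link).encode('utf-8')).hexdigest()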
@@ -159,6 +170,7 @@ class Feed(models.Model):
            'min_to_decay': self.min_to_decay,
            'subs': self.num_subscribers,
            'is_push': self.is_push,
            'is_newsletter': self.is_newsletter,
            'fetched_once': self.fetched_once,
            'search_indexed': self.search_indexed,
            'not_yet_fetched': not self.fetched_once, # Legacy. Doh.

@@ -206,7 +218,7 @@ class Feed(models.Model):

        feed_address = self.feed_address or ""
        feed_link = self.feed_link or ""
        self.hash_address_and_link = hashlib.sha1(feed_address+feed_link).hexdigest()
        self.hash_address_and_link = self.generate_hash_address_and_link(feed_address, feed_link)

        max_feed_title = Feed._meta.get_field('feed_title').max_length
        if len(self.feed_title) > max_feed_title:

@@ -221,24 +233,32 @@ class Feed(models.Model):
        try:
            super(Feed, self).save(*args, **kwargs)
        except IntegrityError, e:
            logging.debug(" ---> ~FRFeed save collision (%s), checking dupe..." % e)
            duplicate_feeds = Feed.objects.filter(feed_address=self.feed_address,
                                                  feed_link=self.feed_link)
            logging.debug(" ---> ~FRFeed save collision (%s), checking dupe hash..." % e)
            feed_address = self.feed_address or ""
            feed_link = self.feed_link or ""
            hash_address_and_link = self.generate_hash_address_and_link(feed_address, feed_link)
            logging.debug(" ---> ~FRNo dupes, checking hash collision: %s" % hash_address_and_link)
            duplicate_feeds = Feed.objects.filter(hash_address_and_link=hash_address_and_link)

            if not duplicate_feeds:
                feed_address = self.feed_address or ""
                feed_link = self.feed_link or ""
                hash_address_and_link = hashlib.sha1(feed_address+feed_link).hexdigest()
                duplicate_feeds = Feed.objects.filter(hash_address_and_link=hash_address_and_link)
                duplicate_feeds = Feed.objects.filter(feed_address=self.feed_address,
                                                      feed_link=self.feed_link)
            if not duplicate_feeds:
                # Feed has been deleted. Just ignore it.
                logging.debug(" ***> Changed to: %s - %s: %s" % (self.feed_address, self.feed_link, duplicate_feeds))
                logging.debug(' ***> [%-30s] Feed deleted (%s).' % (unicode(self)[:30], self.pk))
                return

            if self.pk != duplicate_feeds[0].pk:
                logging.debug(" ---> ~FRFound different feed (%s), merging %s in..." % (duplicate_feeds[0], self.pk))
                feed = Feed.get_by_id(merge_feeds(duplicate_feeds[0].pk, self.pk))
                return feed

            for duplicate_feed in duplicate_feeds:
                if duplicate_feed.pk != self.pk:
                    logging.debug(" ---> ~FRFound different feed (%s), merging %s in..." % (duplicate_feeds[0], self.pk))
                    feed = Feed.get_by_id(merge_feeds(duplicate_feeds[0].pk, self.pk))
                    return feed
                else:
                    logging.debug(" ---> ~FRFeed is its own dupe? %s == %s" % (self, duplicate_feeds))
        except DatabaseError, e:
            logging.debug(" ---> ~FBFeed update failed, no change: %s / %s..." % (kwargs.get('update_fields', None), e))
            pass

        return self


@@ -258,7 +278,7 @@ class Feed(models.Model):
        Feed.objects.get(pk=feed_id).index_feed_for_search()

    def index_feed_for_search(self):
        if self.num_subscribers > 1 and not self.branch_from_feed:
        if self.num_subscribers > 1 and not self.branch_from_feed and not self.is_newsletter:
            SearchFeed.index(feed_id=self.pk,
                             title=self.feed_title,
                             address=self.feed_address,

@@ -356,15 +376,26 @@ class Feed(models.Model):
        return bool(not (self.favicon_not_found or self.favicon_color))

    @classmethod
    def get_feed_from_url(cls, url, create=True, aggressive=False, fetch=True, offset=0):
    def get_feed_from_url(cls, url, create=True, aggressive=False, fetch=True, offset=0, user=None):
        feed = None
        without_rss = False

        if url and url.startswith('newsletter:'):
            return cls.objects.get(feed_address=url)
        if url and re.match('(https?://)?twitter.com/\w+/?$', url):
            without_rss = True
        if url and 'youtube.com/user/' in url:
            username = re.search('youtube.com/user/(\w+)', url).group(1)
            url = "http://gdata.youtube.com/feeds/base/users/%s/uploads" % username
            without_rss = True
        if url and 'youtube.com/channel/' in url:
            channel_id = re.search('youtube.com/channel/([-_\w]+)', url).group(1)
            url = "https://www.youtube.com/feeds/videos.xml?channel_id=%s" % channel_id
            without_rss = True
        if url and 'youtube.com/feeds' in url:
            without_rss = True
        if url and 'youtube.com/playlist' in url:
            without_rss = True

        def criteria(key, value):
            if aggressive:

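The YouTube rewrites above turn user and channel page URLs into their feed endpoints before any fetching happens. The channel branch in isolation (the channel id is a stand-in):

import re

url = "https://www.youtube.com/channel/UC0000000000000000000000"
match = re.search(r'youtube.com/channel/([-_\w]+)', url)
if match:
    # Rewrite the page URL to the channel's Atom feed
    url = "https://www.youtube.com/feeds/videos.xml?channel_id=%s" % match.group(1)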
@@ -389,34 +420,44 @@ class Feed(models.Model):

        # Normalize and check for feed_address, dupes, and feed_link
        url = urlnorm.normalize(url)
        if not url:
            return

        feed = by_url(url)
        found_feed_urls = []

        # Create if it looks good
        if feed and len(feed) > offset:
            feed = feed[offset]
        elif create:
            create_okay = False
            if feedfinder.isFeed(url):
                create_okay = True
            elif fetch:
                # Could still be a feed. Just check if there are entries
                fp = feedparser.parse(url)
                if len(fp.entries):
                    create_okay = True
            if create_okay:
                feed = cls.objects.create(feed_address=url)
                feed = feed.update()

        # Still nothing? Maybe the URL has some clues.
        if not feed and fetch:
            feed_finder_url = feedfinder.feed(url)
            if feed_finder_url and 'comments' not in feed_finder_url:
        else:
            found_feed_urls = feedfinder.find_feeds(url)
            if len(found_feed_urls):
                feed_finder_url = found_feed_urls[0]
                logging.debug(" ---> Found feed URLs for %s: %s" % (url, found_feed_urls))
                feed = by_url(feed_finder_url)
                if not feed and create:
                if feed and len(feed) > offset:
                    feed = feed[offset]
                    logging.debug(" ---> Feed exists (%s), updating..." % (feed))
                    feed = feed.update()
                elif create:
                    logging.debug(" ---> Feed doesn't exist, creating: %s" % (feed_finder_url))
                    feed = cls.objects.create(feed_address=feed_finder_url)
                    feed = feed.update()
                elif feed and len(feed) > offset:
                    feed = feed[offset]
            elif without_rss:
                logging.debug(" ---> Found without_rss feed: %s" % (url))
                feed = cls.objects.create(feed_address=url)
                feed = feed.update(requesting_user_id=user.pk if user else None)


        # Still nothing? Maybe the URL has some clues.
        if not feed and fetch and len(found_feed_urls):
            feed_finder_url = found_feed_urls[0]
            feed = by_url(feed_finder_url)
            if not feed and create:
                feed = cls.objects.create(feed_address=feed_finder_url)
                feed = feed.update()
            elif feed and len(feed) > offset:
                feed = feed[offset]

        # Not created and not within bounds, so toss results.
        if isinstance(feed, QuerySet):

@@ -518,27 +559,31 @@ class Feed(models.Model):
        def _1():
            feed_address = None
            feed = self
            found_feed_urls = []
            try:
                is_feed = feedfinder.isFeed(self.feed_address)
                logging.debug(" ---> Checking: %s" % self.feed_address)
                found_feed_urls = feedfinder.find_feeds(self.feed_address)
                if found_feed_urls:
                    feed_address = found_feed_urls[0]
            except KeyError:
                is_feed = False
            if not is_feed:
                feed_address = feedfinder.feed(self.feed_address)
                if not feed_address and self.feed_link:
                    feed_address = feedfinder.feed(self.feed_link)
            else:
                feed_address_from_link = feedfinder.feed(self.feed_link)
                if feed_address_from_link != self.feed_address:
                    feed_address = feed_address_from_link
                pass
            if not len(found_feed_urls) and self.feed_link:
                found_feed_urls = feedfinder.find_feeds(self.feed_link)
                if len(found_feed_urls) and found_feed_urls[0] != self.feed_address:
                    feed_address = found_feed_urls[0]

            if feed_address:
                if (feed_address.endswith('feedburner.com/atom.xml') or
                    feed_address.endswith('feedburner.com/feed/')):
                    logging.debug(" ---> Feed points to 'Wierdo', ignoring.")
                if any(ignored_domain in feed_address for ignored_domain in [
                    'feedburner.com/atom.xml',
                    'feedburner.com/feed/',
                    'feedsportal.com',
                ]):
                    logging.debug(" ---> Feed points to 'Wierdo' or 'feedsportal', ignoring.")
                    return False, self
                try:
                    self.feed_address = feed_address
                    feed = self.save()
                    feed.count_subscribers()
                    feed.schedule_feed_fetch_immediately()
                    feed.has_feed_exception = False
                    feed.active = True
@@ -597,7 +642,6 @@ class Feed(models.Model):
            self.save()

    def count_errors_in_history(self, exception_type='feed', status_code=None, fetch_history=None):
        logging.debug('   ---> [%-30s] Counting errors in history...' % (unicode(self)[:30]))
        if not fetch_history:
            fetch_history = MFetchHistory.feed(self.pk)
        fh = fetch_history[exception_type + '_fetch_history']

@@ -622,6 +666,9 @@ class Feed(models.Model):
            self.has_page_exception = False
            self.save()

        logging.debug('   ---> [%-30s] ~FBCounting any errors in history: %s (%s non errors)' %
                      (unicode(self)[:30], len(errors), len(non_errors)))

        return errors, non_errors

    def count_redirects_in_history(self, fetch_type='feed', fetch_history=None):

@@ -648,7 +695,8 @@ class Feed(models.Model):
        r = redis.Redis(connection_pool=settings.REDIS_FEED_SUB_POOL)
        total_key = "s:%s" % self.original_feed_id
        premium_key = "sp:%s" % self.original_feed_id
        last_recount = r.zscore(total_key, -1)
        last_recount = r.zscore(total_key, -1) # Need to subtract this extra when counting subs
        last_recount = r.zscore(premium_key, -1) # Need to subtract this extra when counting subs

        # Check for expired feeds with no active users who would have triggered a cleanup
        if last_recount and last_recount > subscriber_expire:

@@ -694,18 +742,18 @@ class Feed(models.Model):

        results = pipeline.execute()

        # -1 due to key=-1 signaling counts_converted_to_redis
        total += results[0] - 1
        active += results[1] - 1
        premium += results[2] - 1
        active_premium += results[3] - 1
        # -1 due to counts_converted_to_redis using key=-1 for last_recount date
        total += max(0, results[0] - 1)
        active += max(0, results[1] - 1)
        premium += max(0, results[2] - 1)
        active_premium += max(0, results[3] - 1)

        original_num_subscribers = self.num_subscribers
        original_active_subs = self.active_subscribers
        original_premium_subscribers = self.premium_subscribers
        original_active_premium_subscribers = self.active_premium_subscribers
        logging.info("   ---> [%-30s] ~SN~FBCounting subscribers from ~FCredis~FB: ~FMt:~SB~FM%s~SN a:~SB%s~SN p:~SB%s~SN ap:~SB%s" %
                     (self.title[:30], total, active, premium, active_premium))
        logging.info("   ---> [%-30s] ~SN~FBCounting subscribers from ~FCredis~FB: ~FMt:~SB~FM%s~SN a:~SB%s~SN p:~SB%s~SN ap:~SB%s ~SN~FC%s" %
                     (self.title[:30], total, active, premium, active_premium, "(%s branches)" % (len(feed_ids)-1) if len(feed_ids)>1 else ""))
        else:
            from apps.reader.models import UserSubscription

@@ -749,8 +797,11 @@ class Feed(models.Model):
            self.active_subscribers != original_active_subs or
            self.premium_subscribers != original_premium_subscribers or
            self.active_premium_subscribers != original_active_premium_subscribers):
            self.save(update_fields=['num_subscribers', 'active_subscribers',
                                     'premium_subscribers', 'active_premium_subscribers'])
            if original_premium_subscribers == -1 or original_active_premium_subscribers == -1:
                self.save()
            else:
                self.save(update_fields=['num_subscribers', 'active_subscribers',
                                         'premium_subscribers', 'active_premium_subscribers'])

        if verbose:
            if self.num_subscribers <= 1:

@@ -866,28 +917,28 @@ class Feed(models.Model):
        map_f = """
            function() {
                var date = (this.story_date.getFullYear()) + "-" + (this.story_date.getMonth()+1);
                emit(date, 1);
                var hour = this.story_date.getUTCHours();
                var day = this.story_date.getDay();
                emit(this.story_hash, {'month': date, 'hour': hour, 'day': day});
            }
        """
        reduce_f = """
            function(key, values) {
                var total = 0;
                for (var i=0; i < values.length; i++) {
                    total += values[i];
                }
                return total;
                return values;
            }
        """
        dates = {}
        res = MStory.objects(story_feed_id=self.pk).map_reduce(map_f, reduce_f, output='inline')
        for r in res:
            dates[r.key] = r.value
            year_found = re.findall(r"(\d{4})-\d{1,2}", r.key)
            if year_found and len(year_found):
                year = int(year_found[0])
                if year < min_year and year > 2000:
                    min_year = year

        dates = defaultdict(int)
        hours = defaultdict(int)
        days = defaultdict(int)
        results = MStory.objects(story_feed_id=self.pk).map_reduce(map_f, reduce_f, output='inline')
        for result in results:
            dates[result.value['month']] += 1
            hours[int(result.value['hour'])] += 1
            days[int(result.value['day'])] += 1
            year = int(re.findall(r"(\d{4})-\d{1,2}", result.value['month'])[0])
            if year < min_year and year > 2000:
                min_year = year

        # Add on to existing months, always amending up, never down. (Current month
        # is guaranteed to be accurate, since trim_feeds won't delete it until after
        # a month. Hacker News can have 1,000+ and still be counted.)

@@ -912,7 +963,7 @@ class Feed(models.Model):
            total += dates.get(key, 0)
            month_count += 1
        original_story_count_history = self.data.story_count_history
        self.data.story_count_history = json.encode(months)
        self.data.story_count_history = json.encode({'months': months, 'hours': hours, 'days': days})
        if self.data.story_count_history != original_story_count_history:
            self.data.save(update_fields=['story_count_history'])

@@ -978,7 +1029,7 @@ class Feed(models.Model):
        from utils import feed_fetcher
        r = redis.Redis(connection_pool=settings.REDIS_FEED_UPDATE_POOL)
        original_feed_id = int(self.pk)


        if getattr(settings, 'TEST_DEBUG', False):
            original_feed_address = self.feed_address
            original_feed_link = self.feed_link

@@ -1001,10 +1052,14 @@ class Feed(models.Model):
            'debug': kwargs.get('debug'),
            'fpf': kwargs.get('fpf'),
            'feed_xml': kwargs.get('feed_xml'),
            'requesting_user_id': kwargs.get('requesting_user_id', None)
        }
        disp = feed_fetcher.Dispatcher(options, 1)
        disp.add_jobs([[self.pk]])
        feed = disp.run_jobs()

        if self.is_newsletter:
            feed = self.update_newsletter_icon()
        else:
            disp = feed_fetcher.Dispatcher(options, 1)
            disp.add_jobs([[self.pk]])
            feed = disp.run_jobs()

        if feed:
            feed = Feed.get_by_id(feed.pk)

@@ -1022,7 +1077,14 @@ class Feed(models.Model):
            r.zrem('error_feeds', feed.pk)

        return feed


    def update_newsletter_icon(self):
        from apps.rss_feeds.icon_importer import IconImporter
        icon_importer = IconImporter(self)
        icon_importer.save()

        return self

    @classmethod
    def get_by_id(cls, feed_id, feed_address=None):
        try:

@@ -1167,6 +1229,7 @@ class Feed(models.Model):
                existing_story.story_permalink = story_link
                existing_story.story_guid = story.get('guid')
                existing_story.story_tags = story_tags
                existing_story.original_text_z = None # Reset Text view cache
                # Do not allow publishers to change the story date once a story is published.
                # Leads to incorrect unread story counts.
                if replace_story_date:

@@ -1261,11 +1324,11 @@ class Feed(models.Model):
            self.save_popular_authors(feed_authors=feed_authors[:-1])

    @classmethod
    def trim_old_stories(cls, start=0, verbose=True, dryrun=False):
    def trim_old_stories(cls, start=0, verbose=True, dryrun=False, total=0):
        now = datetime.datetime.now()
        month_ago = now - datetime.timedelta(days=settings.DAYS_OF_STORY_HASHES)
        feed_count = Feed.objects.latest('pk').pk
        total = 0

        for feed_id in xrange(start, feed_count):
            if feed_id % 1000 == 0:
                print "\n\n -------------------------- %s (%s deleted so far) --------------------------\n\n" % (feed_id, total)

@@ -1273,9 +1336,7 @@ class Feed(models.Model):
                feed = Feed.objects.get(pk=feed_id)
            except Feed.DoesNotExist:
                continue
            if feed.active_subscribers > 0:
                continue
            if not feed.last_story_date or feed.last_story_date < month_ago:
            if feed.active_subscribers <= 0 and (not feed.last_story_date or feed.last_story_date < month_ago):
                months_ago = 6
                if feed.last_story_date:
                    months_ago = int((now - feed.last_story_date).days / 30.0)

@@ -1284,6 +1345,12 @@ class Feed(models.Model):
                    print " DRYRUN: %s cutoff - %s" % (cutoff, feed)
                else:
                    total += MStory.trim_feed(feed=feed, cutoff=cutoff, verbose=verbose)
            else:
                if dryrun:
                    print " DRYRUN: %s/%s cutoff - %s" % (cutoff, feed.story_cutoff, feed)
                else:
                    total += feed.trim_feed(verbose=verbose)


        print " ---> Deleted %s stories in total." % total

@ -1304,19 +1371,56 @@ class Feed(models.Model):
|
|||
cutoff = 400
|
||||
elif self.active_premium_subscribers <= 20:
|
||||
cutoff = 450
|
||||
|
||||
|
||||
if self.active_subscribers and self.average_stories_per_month < 5 and self.stories_last_month < 5:
|
||||
cutoff /= 2
|
||||
if self.active_premium_subscribers <= 1 and self.average_stories_per_month <= 1 and self.stories_last_month <= 1:
|
||||
cutoff /= 2
|
||||
|
||||
r = redis.Redis(connection_pool=settings.REDIS_FEED_READ_POOL)
|
||||
pipeline = r.pipeline()
|
||||
read_stories_per_week = []
|
||||
now = datetime.datetime.now()
|
||||
for weeks_back in range(2*int(math.floor(settings.DAYS_OF_STORY_HASHES/7))):
|
||||
weeks_ago = now - datetime.timedelta(days=7*weeks_back)
|
||||
week_of_year = weeks_ago.strftime('%Y-%U')
|
||||
feed_read_key = "fR:%s:%s" % (self.pk, week_of_year)
|
||||
pipeline.get(feed_read_key)
|
||||
read_stories_per_week = pipeline.execute()
|
||||
read_stories_last_month = sum([int(rs) for rs in read_stories_per_week if rs])
|
||||
if read_stories_last_month == 0:
|
||||
original_cutoff = cutoff
|
||||
cutoff = min(cutoff, 10)
|
||||
try:
|
||||
logging.debug(" ---> [%-30s] ~FBTrimming down to ~SB%s (instead of %s)~SN stories (~FM%s~FB)" % (self, cutoff, original_cutoff, self.last_story_date.strftime("%Y-%m-%d") if self.last_story_date else "No last story date"))
|
||||
except ValueError, e:
|
||||
logging.debug(" ***> [%-30s] Error trimming: %s" % (self, e))
|
||||
pass
|
||||
|
||||
return cutoff
|
||||
|
||||
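A worked example of the cutoff math above, with illustrative numbers only:

```python
# Hypothetical feed: 15 active premium subscribers, nearly dormant, unread.
cutoff = 450              # <= 20 active premium subscribers
cutoff /= 2               # < 5 stories/month on average -> 225
cutoff /= 2               # <= 1 story last month -> 112 (Python 2 integer division)
cutoff = min(cutoff, 10)  # no read stories in the window -> 10
```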
def trim_feed(self, verbose=False, cutoff=None):
if not cutoff:
cutoff = self.story_cutoff
MStory.trim_feed(feed=self, cutoff=cutoff, verbose=verbose)
return MStory.trim_feed(feed=self, cutoff=cutoff, verbose=verbose)

def purge_feed_stories(self, update=True):
MStory.purge_feed_stories(feed=self, cutoff=self.story_cutoff)
if update:
self.update()

def purge_author(self, author):
all_stories = MStory.objects.filter(story_feed_id=self.pk)
author_stories = MStory.objects.filter(story_feed_id=self.pk, story_author_name__iexact=author)
logging.debug(" ---> Deleting %s of %s stories in %s by '%s'." % (author_stories.count(), all_stories.count(), self, author))
author_stories.delete()

def purge_tag(self, tag):
all_stories = MStory.objects.filter(story_feed_id=self.pk)
tagged_stories = MStory.objects.filter(story_feed_id=self.pk, story_tags__icontains=tag)
logging.debug(" ---> Deleting %s of %s stories in %s by '%s'." % (tagged_stories.count(), all_stories.count(), self, tag))
tagged_stories.delete()

# @staticmethod
# def clean_invalid_ids():
# history = MFeedFetchHistory.objects(status_code=500, exception__contains='InvalidId:')

@@ -1345,6 +1449,179 @@ class Feed(models.Model):
stories = cls.format_stories(stories_db)

return stories

@classmethod
def query_popularity(cls, query, limit, order='newest'):
popularity = {}
seen_feeds = set()
feed_title_to_id = dict()

# Collect stories, sort by feed
story_ids = SearchStory.global_query(query, order=order, offset=0, limit=limit)
for story_hash in story_ids:
feed_id, story_id = MStory.split_story_hash(story_hash)
feed = Feed.get_by_id(feed_id)
if not feed: continue
if feed.feed_title in seen_feeds:
feed_id = feed_title_to_id[feed.feed_title]
else:
feed_title_to_id[feed.feed_title] = feed_id
seen_feeds.add(feed.feed_title)
if feed_id not in popularity:
well_read_score = feed.well_read_score()
popularity[feed_id] = {
'feed_title': feed.feed_title,
'feed_url': feed.feed_link,
'num_subscribers': feed.num_subscribers,
'feed_id': feed.pk,
'story_ids': [],
'authors': {},
'read_pct': well_read_score['read_pct'],
'reader_count': well_read_score['reader_count'],
'story_count': well_read_score['story_count'],
'reach_score': well_read_score['reach_score']
}
popularity[feed_id]['story_ids'].append(story_hash)

sorted_popularity = sorted(popularity.values(), key=lambda x: x['reach_score'],
reverse=True)

# Extract story authors from feeds
for feed in sorted_popularity:
story_ids = feed['story_ids']
stories_db = MStory.objects(story_hash__in=story_ids)
stories = cls.format_stories(stories_db)
for story in stories:
story['story_permalink'] = story['story_permalink'][:250]
if story['story_authors'] not in feed['authors']:
feed['authors'][story['story_authors']] = {
'name': story['story_authors'],
'count': 0,
'tags': {},
'stories': [],
}
authors = feed['authors'][story['story_authors']]
seen = False
for seen_story in authors['stories']:
if seen_story['url'] == story['story_permalink']:
seen = True
break
else:
authors['stories'].append({
'title': story['story_title'],
'url': story['story_permalink'],
'date': story['story_date'],
})
authors['count'] += 1
if seen: continue # Don't recount tags
for tag in story['story_tags']:
if tag not in authors['tags']:
authors['tags'][tag] = 0
authors['tags'][tag] += 1
sorted_authors = sorted(feed['authors'].values(), key=lambda x: x['count'])
feed['authors'] = sorted_authors

# pprint(sorted_popularity)
return sorted_popularity

def well_read_score(self):
from apps.reader.models import UserSubscription

# Average percentage of stories read vs published across recently active subscribers
r = redis.Redis(connection_pool=settings.REDIS_STORY_HASH_POOL)
p = r.pipeline()

subscribing_users = UserSubscription.objects.filter(feed_id=self.pk).values('user_id')
subscribing_user_ids = [sub['user_id'] for sub in subscribing_users]

for user_id in subscribing_user_ids:
user_rs = "RS:%s:%s" % (user_id, self.pk)
p.scard(user_rs)

counts = p.execute()
counts = [c for c in counts if c > 0]
reader_count = len(counts)

story_count = MStory.objects(story_feed_id=self.pk,
story_date__gte=self.unread_cutoff).count()
if reader_count and story_count:
average_pct = (sum(counts) / float(reader_count)) / float(story_count)
else:
average_pct = 0

reach_score = average_pct * reader_count * story_count

return {'read_pct': average_pct, 'reader_count': reader_count,
'reach_score': reach_score, 'story_count': story_count}
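A worked example of the score, with illustrative numbers only:

```python
# 10 readers with non-empty read sets, 40 stories in the unread window,
# 120 reads in total across those readers.
average_pct = (120 / 10.0) / 40.0    # 0.3: the typical reader read 30% of stories
reach_score = average_pct * 10 * 40  # 120.0: read depth weighted by breadth and volume
```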
@classmethod
def xls_query_popularity(cls, queries, limit):
import xlsxwriter
workbook = xlsxwriter.Workbook('NewsBlurPopularity.xlsx')
bold = workbook.add_format({'bold': 1})
date_format = workbook.add_format({'num_format': 'mmm d yyyy'})
unread_format = workbook.add_format({'font_color': '#E0E0E0'})
if isinstance(queries, str):
queries = [q.strip() for q in queries.split(',')]

for query in queries:
worksheet = workbook.add_worksheet(query)
row = 1
col = 0
worksheet.write(0, col, 'Feed', bold)
worksheet.write(0, col+1, 'Feed URL', bold)
worksheet.write(0, col+2, '# Subs', bold)
worksheet.write(0, col+4, '# Readers', bold)
worksheet.write(0, col+3, 'Reach score', bold)
worksheet.write(0, col+5, 'Read %', bold)
worksheet.write(0, col+6, '# stories 30d', bold)
worksheet.write(0, col+7, 'Author', bold)
worksheet.write(0, col+8, 'Story Title', bold)
worksheet.write(0, col+9, 'Story URL', bold)
worksheet.write(0, col+10, 'Story Date', bold)
worksheet.write(0, col+11, 'Tag', bold)
worksheet.write(0, col+12, 'Tag Count', bold)
worksheet.set_column(col, col, 15)
worksheet.set_column(col+1, col+1, 20)
worksheet.set_column(col+2, col+2, 8)
worksheet.set_column(col+3, col+3, 8)
worksheet.set_column(col+4, col+4, 8)
worksheet.set_column(col+5, col+5, 8)
worksheet.set_column(col+6, col+6, 8)
worksheet.set_column(col+7, col+7, 15)
worksheet.set_column(col+8, col+8, 30)
worksheet.set_column(col+9, col+9, 20)
worksheet.set_column(col+10, col+10, 10)
worksheet.set_column(col+11, col+11, 15)
worksheet.set_column(col+12, col+12, 8)
popularity = cls.query_popularity(query, limit=limit)

worksheet.write(row, col, query)
for feed in popularity:
worksheet.write(row, col+0, feed['feed_title'])
worksheet.write_url(row, col+1, feed['feed_url'])
worksheet.write(row, col+2, feed['num_subscribers'])
worksheet.write(row, col+4, feed['reader_count'])
worksheet.write(row, col+3, feed['reach_score'])
worksheet.write(row, col+5, feed['read_pct'])
worksheet.write(row, col+6, feed['story_count'])
worksheet.conditional_format(row, col+3, row, col+6, {'type': 'cell',
'criteria': '==',
'value': 0,
'format': unread_format})
for author in feed['authors']:
worksheet.write(row, col+7, author['name'])
for story in author['stories']:
worksheet.write(row, col+8, story['title'])
worksheet.write_url(row, col+9, story['url'])
worksheet.write_datetime(row, col+10, story['date'], date_format)
row += 1
for tag, count in author['tags'].items():
worksheet.write(row, col+11, tag)
worksheet.write(row, col+12, count)
row += 1

workbook.close()

def find_stories(self, query, order="newest", offset=0, limit=25):
story_ids = SearchStory.query(feed_ids=[self.pk], query=query, order=order,

@@ -1368,11 +1645,26 @@ class Feed(models.Model):
return stories

@classmethod
def format_story(cls, story_db, feed_id=None, text=False, include_permalinks=False):
def format_story(cls, story_db, feed_id=None, text=False, include_permalinks=False,
show_changes=False):
if isinstance(story_db.story_content_z, unicode):
story_db.story_content_z = story_db.story_content_z.decode('base64')

story_content = ''
latest_story_content = None
has_changes = False
if (not show_changes and
hasattr(story_db, 'story_latest_content_z') and
story_db.story_latest_content_z):
latest_story_content = smart_unicode(zlib.decompress(story_db.story_latest_content_z))
if story_db.story_content_z:
story_content = smart_unicode(zlib.decompress(story_db.story_content_z))

if '<ins' in story_content or '<del' in story_content:
has_changes = True
if not show_changes and latest_story_content:
story_content = latest_story_content

story_content = story_db.story_content_z and zlib.decompress(story_db.story_content_z) or ''
story = {}
story['story_hash'] = getattr(story_db, 'story_hash', None)
story['story_tags'] = story_db.story_tags or []

@@ -1384,6 +1676,7 @@ class Feed(models.Model):
story['story_permalink'] = story_db.story_permalink
story['image_urls'] = story_db.image_urls
story['story_feed_id'] = feed_id or story_db.story_feed_id
story['has_modifications']= has_changes
story['comment_count'] = story_db.comment_count if hasattr(story_db, 'comment_count') else 0
story['comment_user_ids'] = story_db.comment_user_ids if hasattr(story_db, 'comment_user_ids') else []
story['share_count'] = story_db.share_count if hasattr(story_db, 'share_count') else 0

@@ -1410,8 +1703,6 @@ class Feed(models.Model):
text = re.sub(r'\n+', '\n\n', text)
text = re.sub(r'\t+', '\t', text)
story['text'] = text
if '<ins' in story['story_content'] or '<del' in story['story_content']:
story['has_modifications'] = True

return story

@@ -1595,6 +1886,8 @@ class Feed(models.Model):
elif spd == 0:
if subs > 1:
total = 60 * 6
elif subs == 1:
total = 60 * 12
else:
total = 60 * 24
months_since_last_story = seconds_timesince(self.last_story_date) / (60*60*24*30)

@@ -1621,8 +1914,11 @@ class Feed(models.Model):
if len(fetch_history['push_history']):
total = total * 12

# 2 day max
total = min(total, 60*24*2)
# 12 hour max for premiums, 48 hour max for free
if subs >= 1:
total = min(total, 60*12*1)
else:
total = min(total, 60*24*2)

if verbose:
logging.debug(" ---> [%-30s] Fetched every %s min - Subs: %s/%s/%s Stories/day: %s" % (

@@ -1675,6 +1971,10 @@ class Feed(models.Model):

def schedule_feed_fetch_immediately(self, verbose=True):
r = redis.Redis(connection_pool=settings.REDIS_FEED_UPDATE_POOL)
if not self.num_subscribers:
logging.debug(' ---> [%-30s] Not scheduling feed fetch immediately, no subs.' % (unicode(self)[:30]))
return

if verbose:
logging.debug(' ---> [%-30s] Scheduling feed fetch immediately...' % (unicode(self)[:30]))

@@ -1761,6 +2061,10 @@ class FeedData(models.Model):
super(FeedData, self).save(*args, **kwargs)
except (IntegrityError, OperationError):
if hasattr(self, 'id') and self.id: self.delete()
except DatabaseError, e:
# Nothing updated
logging.debug(" ---> ~FRNothing updated in FeedData (%s): %s" % (self.feed, e))
pass

class MFeedIcon(mongo.Document):

@@ -1809,9 +2113,12 @@ class MFeedPage(mongo.Document):

def save(self, *args, **kwargs):
if self.page_data:
self.page_data = zlib.compress(self.page_data)
self.page_data = zlib.compress(self.page_data).decode('utf-8')
return super(MFeedPage, self).save(*args, **kwargs)

def page(self):
return zlib.decompress(self.page_data)

@classmethod
def get_data(cls, feed_id):
data = None

@@ -1864,10 +2171,10 @@ class MStory(mongo.Document):
{'fields': ['story_hash'],
'unique': True,
'types': False, }],
'index_drop_dups': True,
'ordering': ['-story_date'],
'allow_inheritance': False,
'cascade': False,
'strict': False,
}

RE_STORY_HASH = re.compile(r"^(\d{1,10}):(\w{6})$")
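The hash packs the feed's integer id and a 6-character guid hash into one key (hypothetical values shown):

```python
match = MStory.RE_STORY_HASH.match("42:a1b2c3")
feed_id, guid_hash = match.groups()  # ('42', 'a1b2c3')
```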
@@ -1924,7 +2231,16 @@ class MStory(mongo.Document):
self.remove_from_search_index()

super(MStory, self).delete(*args, **kwargs)

@classmethod
def purge_feed_stories(cls, feed, cutoff, verbose=True):
stories = cls.objects(story_feed_id=feed.pk)
logging.debug(" ---> Deleting %s stories from %s" % (stories.count(), feed))
if stories.count() > cutoff*1.25:
logging.debug(" ***> ~FRToo many stories in %s, not purging..." % (feed))
return
stories.delete()

@classmethod
def index_all_for_search(cls, offset=0):
if not offset:

@@ -1959,7 +2275,7 @@ class MStory(mongo.Document):
SearchStory.remove(self.story_hash)
except NotFoundException:
pass

@classmethod
def trim_feed(cls, cutoff, feed_id=None, feed=None, verbose=True):
extra_stories_count = 0

@@ -1974,7 +2290,7 @@ class MStory(mongo.Document):
stories = cls.objects(
story_feed_id=feed_id
).only('story_date').order_by('-story_date')

if stories.count() > cutoff:
logging.debug(' ---> [%-30s] ~FMFound %s stories. Trimming to ~SB%s~SN...' %
(unicode(feed)[:30], stories.count(), cutoff))

@@ -1991,6 +2307,7 @@ class MStory(mongo.Document):
for story in extra_stories:
if story.share_count:
shared_story_count += 1
extra_stories_count -= 1
continue
story.delete()
if verbose:

@@ -2220,7 +2537,7 @@ class MStory(mongo.Document):
return original_page

class MStarredStory(mongo.Document):
class MStarredStory(mongo.DynamicDocument):
"""Like MStory, but not inherited due to large overhead of _cls and _type in
mongoengine's inheritance model on every single row."""
user_id = mongo.IntField(unique_with=('story_guid',))

@@ -2244,10 +2561,11 @@ class MStarredStory(mongo.Document):

meta = {
'collection': 'starred_stories',
'indexes': [('user_id', '-starred_date'), ('user_id', 'story_feed_id'), 'story_feed_id'],
'index_drop_dups': True,
'indexes': [('user_id', '-starred_date'), ('user_id', 'story_feed_id'),
('user_id', 'story_hash'), 'story_feed_id'],
'ordering': ['-starred_date'],
'allow_inheritance': False,
'strict': False,
}

def save(self, *args, **kwargs):

@@ -2302,7 +2620,7 @@ class MStarredStory(mongo.Document):
},
}])
month_ago = datetime.datetime.now() - datetime.timedelta(days=days)
user_ids = stats['result']
user_ids = list(stats)
user_ids = sorted(user_ids, key=lambda x:x['stories'], reverse=True)
print " ---> Found %s users with more than %s starred stories" % (len(user_ids), stories)

@@ -2419,8 +2737,11 @@ class MStarredStoryCounts(mongo.Document):

if not total_only:
cls.objects(user_id=user_id).delete()
user_tags = cls.count_tags_for_user(user_id)
user_feeds = cls.count_feeds_for_user(user_id)
try:
user_tags = cls.count_tags_for_user(user_id)
user_feeds = cls.count_feeds_for_user(user_id)
except pymongo.errors.OperationFailure, e:
logging.debug(" ---> ~FBOperationError on mongo: ~SB%s" % e)

total_stories_count = MStarredStory.objects(user_id=user_id).count()
cls.objects(user_id=user_id, tag=None, feed_id=None).update_one(set__count=total_stories_count,

@@ -2535,8 +2856,9 @@ class MFetchHistory(mongo.Document):
history = fetch_history.push_history or []

history = [[date, code, message]] + history
if code and code >= 400:
history = history[:50]
any_exceptions = any([c for d, c, m in history if c not in [200, 304]])
if any_exceptions:
history = history[:25]
else:
history = history[:5]

@@ -2597,7 +2919,7 @@ def merge_feeds(original_feed_id, duplicate_feed_id, force=False):
return original_feed_id

heavier_dupe = original_feed.num_subscribers < duplicate_feed.num_subscribers
branched_original = original_feed.branch_from_feed
branched_original = original_feed.branch_from_feed and not duplicate_feed.branch_from_feed
if (heavier_dupe or branched_original) and not force:
original_feed, duplicate_feed = duplicate_feed, original_feed
original_feed_id, duplicate_feed_id = duplicate_feed_id, original_feed_id

@@ -98,16 +98,16 @@ class PageImporter(object):
logging.debug(' ***> [%-30s] Page fetch failed using requests: %s' % (self.feed, e))
self.save_no_page()
return
try:
data = response.text
except (LookupError, TypeError):
data = response.content
# try:
data = response.content
# except (LookupError, TypeError):
# data = response.content

if response.encoding and response.encoding != 'utf-8':
try:
data = data.encode(response.encoding)
except LookupError:
pass
# if response.encoding and response.encoding != 'utf-8':
# try:
# data = data.encode(response.encoding)
# except LookupError:
# pass
else:
try:
data = open(feed_link, 'r').read()

@@ -270,8 +270,12 @@ class PageImporter(object):
if not saved:
try:
feed_page = MFeedPage.objects.get(feed_id=self.feed.pk)
feed_page.page_data = html
feed_page.save()
# feed_page.page_data = html.encode('utf-8')
if feed_page.page() == html:
logging.debug(' ---> [%-30s] ~FYNo change in page data: %s' % (self.feed.title[:30], self.feed.feed_link))
else:
feed_page.page_data = html
feed_page.save()
except MFeedPage.DoesNotExist:
feed_page = MFeedPage.objects.create(feed_id=self.feed.pk, page_data=html)
return feed_page

@@ -4,6 +4,7 @@ import shutil
import time
import redis
from celery.task import Task
from celery.exceptions import SoftTimeLimitExceeded
from utils import log as logging
from utils import s3_utils as s3
from django.conf import settings

@@ -29,6 +30,10 @@ class TaskFeeds(Task):
now_timestamp = int(now.strftime("%s"))
queued_feeds = r.zrangebyscore('scheduled_updates', 0, now_timestamp)
r.zremrangebyscore('scheduled_updates', 0, now_timestamp)
if not queued_feeds:
logging.debug(" ---> ~SN~FB~BMNo feeds to queue! Exiting...")
return

r.sadd('queued_feeds', *queued_feeds)
logging.debug(" ---> ~SN~FBQueuing ~SB%s~SN stale feeds (~SB%s~SN/~FG%s~FB~SN/%s tasked/queued/scheduled)" % (
len(queued_feeds),

@@ -124,6 +129,8 @@ class UpdateFeeds(Task):
name = 'update-feeds'
max_retries = 0
ignore_result = True
time_limit = 10*60
soft_time_limit = 9*60

def run(self, feed_pks, **kwargs):
from apps.rss_feeds.models import Feed

@@ -156,14 +163,21 @@ class UpdateFeeds(Task):
if not feed or feed.pk != int(feed_pk):
logging.info(" ---> ~FRRemoving feed_id %s from tasked_feeds queue, points to %s..." % (feed_pk, feed and feed.pk))
r.zrem('tasked_feeds', feed_pk)
if feed:
if not feed:
continue
try:
feed.update(**options)
if profiler_activated: profiler.process_celery_finished()
except SoftTimeLimitExceeded, e:
feed.save_feed_history(505, 'Timeout', e)
logging.info(" ---> [%-30s] ~BR~FWTime limit hit!~SB~FR Moving on to next feed..." % feed)
if profiler_activated: profiler.process_celery_finished()

class NewFeeds(Task):
name = 'new-feeds'
max_retries = 0
ignore_result = True
time_limit = 10*60
soft_time_limit = 9*60

def run(self, feed_pks, **kwargs):
from apps.rss_feeds.models import Feed

@@ -4,6 +4,7 @@ from requests.packages.urllib3.exceptions import LocationParseError
from socket import error as SocketError
from mongoengine.queryset import NotUniqueError
from vendor.readability import readability
from lxml.etree import ParserError
from utils import log as logging
from utils.feed_functions import timelimit, TimeoutError
from OpenSSL.SSL import Error as OpenSSLError

@@ -61,7 +62,8 @@ class TextImporter:
positive_keywords=["postContent", "postField"])
try:
content = original_text_doc.summary(html_partial=True)
except readability.Unparseable:
except (readability.Unparseable, ParserError), e:
logging.user(self.request, "~SN~FRFailed~FY to fetch ~FGoriginal text~FY: %s" % e)
return

try:

@@ -15,4 +15,5 @@ urlpatterns = patterns('',
url(r'^load_single_feed', views.load_single_feed, name='feed-canonical'),
url(r'^original_text', views.original_text, name='original-text'),
url(r'^original_story', views.original_story, name='original-story'),
url(r'^story_changes', views.story_changes, name='story-changes'),
)

@@ -16,15 +16,15 @@ from apps.analyzer.models import get_classifiers_for_user
from apps.reader.models import UserSubscription
from apps.rss_feeds.models import MStory
from utils.user_functions import ajax_login_required
from utils import json_functions as json, feedfinder
from utils import json_functions as json, feedfinder2 as feedfinder
from utils.feed_functions import relative_timeuntil, relative_timesince
from utils.user_functions import get_user
from utils.view_functions import get_argument_or_404
from utils.view_functions import required_params
from utils.view_functions import is_true
from vendor.timezones.utilities import localtime_for_timezone
from utils.ratelimit import ratelimit

IGNORE_AUTOCOMPLETE = [
"facebook.com/feeds/notifications.php",
"inbox",

@@ -33,14 +33,19 @@ IGNORE_AUTOCOMPLETE = [
"latitude",
]

@ajax_login_required
@json.json_view
def search_feed(request):
address = request.REQUEST.get('address')
offset = int(request.REQUEST.get('offset', 0))
if not address:
return dict(code=-1, message="Please provide a URL/address.")

feed = Feed.get_feed_from_url(address, create=False, aggressive=True, offset=offset)

logging.user(request.user, "~FBFinding feed (search_feed): %s" % address)
ip = request.META.get('HTTP_X_FORWARDED_FOR', None) or request.META['REMOTE_ADDR']
logging.user(request.user, "~FBIP: %s" % ip)
aggressive = request.user.is_authenticated()
feed = Feed.get_feed_from_url(address, create=False, aggressive=aggressive, offset=offset)
if feed:
return feed.canonical()
else:

@@ -150,7 +155,7 @@ def feed_autocomplete(request):
else:
return feeds

@ratelimit(minutes=1, requests=10)
@ratelimit(minutes=1, requests=30)
@json.json_view
def load_feed_statistics(request, feed_id):
user = get_user(request)

@@ -193,7 +198,21 @@ def load_feed_statistics(request, feed_id):
# Stories per month - average and month-by-month breakout
average_stories_per_month, story_count_history = feed.average_stories_per_month, feed.data.story_count_history
stats['average_stories_per_month'] = average_stories_per_month
stats['story_count_history'] = story_count_history and json.decode(story_count_history)
story_count_history = story_count_history and json.decode(story_count_history)
if story_count_history and isinstance(story_count_history, dict):
stats['story_count_history'] = story_count_history['months']
stats['story_days_history'] = story_count_history['days']
stats['story_hours_history'] = story_count_history['hours']
else:
stats['story_count_history'] = story_count_history

# Rotate hours to match user's timezone offset
localoffset = timezone.utcoffset(datetime.datetime.utcnow())
hours_offset = int(localoffset.total_seconds() / 3600)
rotated_hours = {}
for hour, value in stats['story_hours_history'].items():
rotated_hours[str(int(hour)+hours_offset)] = value
stats['story_hours_history'] = rotated_hours
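Note: as written, the shift can produce hour keys outside 0-23 once `hour + hours_offset` crosses midnight (and the legacy branch above never sets `story_hours_history`). A wraparound, sketched under the assumption of 24 hourly buckets:

```python
# Wrap shifted hours back into the 0-23 range.
rotated_hours[str((int(hour) + hours_offset) % 24)] = value
```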
# Subscribers
stats['subscriber_count'] = feed.num_subscribers

@@ -231,7 +250,8 @@ def load_feed_settings(request, feed_id):
stats['duplicate_addresses'] = feed.duplicate_addresses.all()

return stats

@ratelimit(minutes=10, requests=10)
@json.json_view
def exception_retry(request):
user = get_user(request)

@@ -296,9 +316,9 @@ def exception_change_feed_address(request):
timezone = request.user.profile.timezone
code = -1

if not feed.known_good and (feed.has_page_exception or feed.has_feed_exception):
if False and (feed.has_page_exception or feed.has_feed_exception):
# Fix broken feed
logging.user(request, "~FRFixing feed exception by address: ~SB%s~SN to ~SB%s" % (feed.feed_address, feed_address))
logging.user(request, "~FRFixing feed exception by address: %s - ~SB%s~SN to ~SB%s" % (feed, feed.feed_address, feed_address))
feed.has_feed_exception = False
feed.active = True
feed.fetched_once = False

@@ -317,7 +337,10 @@ def exception_change_feed_address(request):
else:
# Branch good feed
logging.user(request, "~FRBranching feed by address: ~SB%s~SN to ~SB%s" % (feed.feed_address, feed_address))
feed, _ = Feed.objects.get_or_create(feed_address=feed_address, feed_link=feed.feed_link)
try:
feed = Feed.objects.get(hash_address_and_link=Feed.generate_hash_address_and_link(feed_address, feed.feed_link))
except Feed.DoesNotExist:
feed = Feed.objects.create(feed_address=feed_address, feed_link=feed.feed_link)
code = 1
if feed.pk != original_feed.pk:
try:

@@ -377,17 +400,17 @@ def exception_change_feed_link(request):
timezone = request.user.profile.timezone
code = -1

if not feed.known_good and (feed.has_page_exception or feed.has_feed_exception):
if False and (feed.has_page_exception or feed.has_feed_exception):
# Fix broken feed
logging.user(request, "~FRFixing feed exception by link: ~SB%s~SN to ~SB%s" % (feed.feed_link, feed_link))
feed_address = feedfinder.feed(feed_link)
if feed_address:
found_feed_urls = feedfinder.find_feeds(feed_link)
if len(found_feed_urls):
code = 1
feed.has_page_exception = False
feed.active = True
feed.fetched_once = False
feed.feed_link = feed_link
feed.feed_address = feed_address
feed.feed_address = found_feed_urls[0]
duplicate_feed = feed.schedule_feed_fetch_immediately()
if duplicate_feed:
new_feed = Feed.objects.get(pk=duplicate_feed.pk)

@@ -399,7 +422,10 @@ def exception_change_feed_link(request):
else:
# Branch good feed
logging.user(request, "~FRBranching feed by link: ~SB%s~SN to ~SB%s" % (feed.feed_link, feed_link))
feed, _ = Feed.objects.get_or_create(feed_address=feed.feed_address, feed_link=feed_link)
try:
feed = Feed.objects.get(hash_address_and_link=Feed.generate_hash_address_and_link(feed.feed_address, feed_link))
except Feed.DoesNotExist:
feed = Feed.objects.create(feed_address=feed.feed_address, feed_link=feed_link)
code = 1
if feed.pk != original_feed.pk:
try:

@@ -505,3 +531,18 @@ def original_story(request):
original_page = story.fetch_original_page(force=force, request=request, debug=debug)

return HttpResponse(original_page or "")

@required_params('story_hash')
@json.json_view
def story_changes(request):
story_hash = request.REQUEST.get('story_hash', None)
show_changes = is_true(request.REQUEST.get('show_changes', True))
story, _ = MStory.find_story(story_hash=story_hash)
if not story:
logging.user(request, "~FYFetching ~FGoriginal~FY story page: ~FRstory not found")
return {'code': -1, 'message': 'Story not found.', 'original_page': None, 'failed': True}

return {
'story': Feed.format_story(story, show_changes=show_changes)
}
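A sketch of calling the new endpoint; the host and the /rss_feeds/ mount point are assumptions, and the story hash is hypothetical:

```python
import requests

# show_changes=true keeps the <ins>/<del> diff markup in story_content;
# false returns the latest version of the story instead.
resp = requests.get("https://newsblur.com/rss_feeds/story_changes",
                    params={"story_hash": "42:a1b2c3", "show_changes": "true"})
print(resp.json()["story"]["story_content"])
```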
@@ -1,3 +1,4 @@
import re
import time
import datetime
import pymongo

@@ -23,7 +24,6 @@ class MUserSearch(mongo.Document):
meta = {
'collection': 'user_search',
'indexes': ['user_id'],
'index_drop_dups': True,
'allow_inheritance': False,
}

@@ -270,7 +270,8 @@ class SearchStory:
def query(cls, feed_ids, query, order, offset, limit):
cls.create_elasticsearch_mapping()
cls.ES.indices.refresh()

query = re.sub(r'([^\s\w_\-])+', ' ', query) # Strip non-alphanumeric
sort = "date:desc" if order == "newest" else "date:asc"
string_q = pyes.query.QueryStringQuery(query, default_operator="AND")
feed_q = pyes.query.TermsQuery('feed_id', feed_ids[:1000])

@@ -285,8 +286,41 @@ class SearchStory:
logging.info(" ---> ~FG~SNSearch ~FCstories~FG for: ~SB%s~SN (across %s feed%s)" %
(query, len(feed_ids), 's' if len(feed_ids) != 1 else ''))

return [r.get_id() for r in results]
try:
result_ids = [r.get_id() for r in results]
except pyes.InvalidQuery, e:
logging.info(" ---> ~FRInvalid search query \"%s\": %s" % (query, e))
return []

return result_ids

@classmethod
def global_query(cls, query, order, offset, limit):
cls.create_elasticsearch_mapping()
cls.ES.indices.refresh()

query = re.sub(r'([^\s\w_\-])+', ' ', query) # Strip non-alphanumeric
sort = "date:desc" if order == "newest" else "date:asc"
string_q = pyes.query.QueryStringQuery(query, default_operator="AND")
q = pyes.query.BoolQuery(must=[string_q])
try:
results = cls.ES.search(q, indices=cls.index_name(), doc_types=[cls.type_name()],
partial_fields={}, sort=sort, start=offset, size=limit)
except pyes.exceptions.NoServerAvailable:
logging.debug(" ***> ~FRNo search server available.")
return []

logging.info(" ---> ~FG~SNSearch ~FCstories~FG for: ~SB%s~SN (across all feeds)" %
(query))

try:
result_ids = [r.get_id() for r in results]
except pyes.InvalidQuery, e:
logging.info(" ---> ~FRInvalid search query \"%s\": %s" % (query, e))
return []

return result_ids

class SearchFeed:

@@ -8,6 +8,7 @@ import mongoengine as mongo
import random
import requests
import HTMLParser
import tweepy
from collections import defaultdict
from BeautifulSoup import BeautifulSoup
from mongoengine.queryset import Q

@@ -26,7 +27,6 @@ from apps.rss_feeds.text_importer import TextImporter
from apps.rss_feeds.page_importer import PageImporter
from apps.profile.models import Profile, MSentEmail
from vendor import facebook
from vendor import tweepy
from vendor import appdotnet
from vendor import pynliner
from utils import log as logging

@@ -133,6 +133,8 @@ class MSocialProfile(mongo.Document):
stories_last_month = mongo.IntField(default=0)
average_stories_per_month = mongo.IntField(default=0)
story_count_history = mongo.ListField()
story_days_history = mongo.DictField()
story_hours_history = mongo.DictField()
feed_classifier_counts = mongo.DictField()
favicon_color = mongo.StringField(max_length=6)
protected = mongo.BooleanField()

@@ -142,7 +144,6 @@ class MSocialProfile(mongo.Document):
'collection': 'social_profile',
'indexes': ['user_id', 'following_user_ids', 'follower_user_ids', 'unfollowed_user_ids', 'requested_follow_user_ids'],
'allow_inheritance': False,
'index_drop_dups': True,
}

def __unicode__(self):

@@ -690,23 +691,25 @@ class MSocialProfile(mongo.Document):
map_f = """
function() {
var date = (this.shared_date.getFullYear()) + "-" + (this.shared_date.getMonth()+1);
emit(date, 1);
var hour = this.shared_date.getHours();
var day = this.shared_date.getDay();
emit(this.story_hash, {'month': date, 'hour': hour, 'day': day});
}
"""
reduce_f = """
function(key, values) {
var total = 0;
for (var i=0; i < values.length; i++) {
total += values[i];
}
return total;
return values;
}
"""
dates = {}
res = MSharedStory.objects(user_id=self.user_id).map_reduce(map_f, reduce_f, output='inline')
for r in res:
dates[r.key] = r.value
year = int(re.findall(r"(\d{4})-\d{1,2}", r.key)[0])
dates = defaultdict(int)
hours = defaultdict(int)
days = defaultdict(int)
results = MSharedStory.objects(user_id=self.user_id).map_reduce(map_f, reduce_f, output='inline')
for result in results:
dates[result.value['month']] += 1
hours[str(int(result.value['hour']))] += 1
days[str(int(result.value['day']))] += 1
year = int(re.findall(r"(\d{4})-\d{1,2}", result.value['month'])[0])
if year < min_year:
min_year = year

@@ -725,6 +728,8 @@ class MSocialProfile(mongo.Document):
month_count += 1

self.story_count_history = months
self.story_days_history = days
self.story_hours_history = hours
self.average_stories_per_month = total / max(1, month_count)
self.save()

@@ -1395,10 +1400,11 @@ class MCommentReply(mongo.EmbeddedDocument):
'ordering': ['publish_date'],
'id_field': 'reply_id',
'allow_inheritance': False,
'strict': False,
}

class MSharedStory(mongo.Document):
class MSharedStory(mongo.DynamicDocument):
user_id = mongo.IntField()
shared_date = mongo.DateTimeField()
comments = mongo.StringField()

@@ -1435,9 +1441,9 @@ class MSharedStory(mongo.Document):
'collection': 'shared_stories',
'indexes': [('user_id', '-shared_date'), ('user_id', 'story_feed_id'),
'shared_date', 'story_guid', 'story_feed_id', 'story_hash'],
'index_drop_dups': True,
'ordering': ['-shared_date'],
'allow_inheritance': False,
'strict': False,
}

def __unicode__(self):

@@ -1481,7 +1487,7 @@ class MSharedStory(mongo.Document):
self.story_original_content_z = zlib.compress(self.story_original_content)
self.story_original_content = None

self.story_guid_hash = hashlib.sha1(self.story_guid).hexdigest()[:6]
self.story_guid_hash = self.guid_hash
self.story_title = strip_tags(self.story_title)
self.story_hash = self.feed_guid_hash

@@ -1527,7 +1533,7 @@ class MSharedStory(mongo.Document):
},
}])
month_ago = datetime.datetime.now() - datetime.timedelta(days=days)
user_ids = stats['result']
user_ids = list(stats)
user_ids = sorted(user_ids, key=lambda x:x['stories'], reverse=True)
print " ---> Found %s users with more than %s starred stories" % (len(user_ids), stories)

@@ -1910,7 +1916,10 @@ class MSharedStory(mongo.Document):
'story_hash': story['story_hash'],
'user_id__in': sharer_user_ids,
}
shared_stories = cls.objects.filter(**params)
if params.has_key('story_db_id'):
params.pop('story_db_id')
shared_stories = cls.objects.filter(**params)\
.hint([('story_hash', 1)])
for shared_story in shared_stories:
comments = shared_story.comments_with_author()
story['reply_count'] += len(comments['replies'])

@@ -1958,7 +1967,8 @@ class MSharedStory(mongo.Document):
'story_hash': story['story_hash'],
'user_id__in': story['shared_by_friends'],
}
shared_stories = cls.objects.filter(**params)
shared_stories = cls.objects.filter(**params)\
.hint([('story_hash', 1)])
for shared_story in shared_stories:
comments = shared_story.comments_with_author()
story['reply_count'] += len(comments['replies'])

@@ -2045,7 +2055,7 @@ class MSharedStory(mongo.Document):
return "%sstory/%s/%s" % (
profile.blurblog_url,
slugify(self.story_title)[:20],
self.guid_hash[:6]
self.story_hash
)

def generate_post_to_service_message(self, truncate=None, include_url=True):

@@ -2446,13 +2456,17 @@ class MSocialServices(mongo.Document):
logging.user(user, "~BG~FMTwitter import starting...")

api = self.twitter_api()
try:
twitter_user = api.me()
except tweepy.TweepError, e:
api = None

if not api:
logging.user(user, "~BG~FMTwitter import ~SBfailed~SN: no api access.")
self.syncing_twitter = False
self.save()
return

twitter_user = api.me()

self.twitter_picture_url = twitter_user.profile_image_url_https
self.twitter_username = twitter_user.screen_name
self.twitter_refreshed_date = datetime.datetime.utcnow()

@@ -2747,7 +2761,8 @@ class MSocialServices(mongo.Document):
api = self.twitter_api()
api.update_status(status=message)
except tweepy.TweepError, e:
print e
user = User.objects.get(pk=self.user_id)
logging.user(user, "~FRTwitter error: ~SB%s" % e)
return

return True

@@ -2806,7 +2821,6 @@ class MInteraction(mongo.Document):
'collection': 'interactions',
'indexes': [('user_id', '-date'), 'category', 'with_user_id'],
'allow_inheritance': False,
'index_drop_dups': True,
'ordering': ['-date'],
}

@@ -2833,6 +2847,24 @@ class MInteraction(mongo.Document):
'story_hash': story_hash,
}

@classmethod
def trim(cls, user_id, limit=100):
user = User.objects.get(pk=user_id)
interactions = cls.objects.filter(user_id=user_id).skip(limit)
interaction_count = interactions.count(True)

if interaction_count == 0:
interaction_count = cls.objects.filter(user_id=user_id).count()
logging.user(user, "~FBNot trimming interactions, only ~SB%s~SN interactions found" % interaction_count)
return

logging.user(user, "~FBTrimming ~SB%s~SN interactions..." % interaction_count)

for interaction in interactions:
interaction.delete()

logging.user(user, "~FBDone trimming ~SB%s~SN interactions" % interaction_count)

@classmethod
def publish_update_to_subscribers(self, user_id):
user = User.objects.get(pk=user_id)

@@ -3048,7 +3080,6 @@ class MActivity(mongo.Document):
'collection': 'activities',
'indexes': [('user_id', '-date'), 'category', 'with_user_id'],
'allow_inheritance': False,
'index_drop_dups': True,
'ordering': ['-date'],
}

@@ -3073,6 +3104,24 @@ class MActivity(mongo.Document):
'content_id': self.content_id,
'story_hash': story_hash,
}

@classmethod
def trim(cls, user_id, limit=100):
user = User.objects.get(pk=user_id)
activities = cls.objects.filter(user_id=user_id).skip(limit)
activity_count = activities.count(True)

if activity_count == 0:
activity_count = cls.objects.filter(user_id=user_id).count()
logging.user(user, "~FBNot trimming activities, only ~SB%s~SN activities found" % activity_count)
return

logging.user(user, "~FBTrimming ~SB%s~SN activities..." % activity_count)

for activity in activities:
activity.delete()

logging.user(user, "~FBDone trimming ~SB%s~SN activities" % activity_count)

@classmethod
def user(cls, user_id, page=1, limit=4, public=False, categories=None):

@@ -3291,7 +3340,6 @@ class MFollowRequest(mongo.Document):
'indexes': ['follower_user_id', 'followee_user_id'],
'ordering': ['-date'],
'allow_inheritance': False,
'index_drop_dups': True,
}

@classmethod
@@ -47,7 +47,8 @@ def load_social_stories(request, user_id, username=None):
page = request.REQUEST.get('page')
order = request.REQUEST.get('order', 'newest')
read_filter = request.REQUEST.get('read_filter', 'all')
query = request.REQUEST.get('query')
query = request.REQUEST.get('query', '').strip()
include_story_content = is_true(request.REQUEST.get('include_story_content', True))
stories = []
message = None

@@ -115,6 +116,7 @@ def load_social_stories(request, user_id, username=None):
.only('story_hash', 'starred_date', 'user_tags')
shared_stories = MSharedStory.objects(user_id=user.pk,
story_hash__in=story_hashes)\
.hint([('story_hash', 1)])\
.only('story_hash', 'shared_date', 'comments')
starred_stories = dict([(story.story_hash, dict(starred_date=story.starred_date,
user_tags=story.user_tags))

@@ -125,6 +127,8 @@ def load_social_stories(request, user_id, username=None):

nowtz = localtime_for_timezone(now, user.profile.timezone)
for story in stories:
if not include_story_content:
del story['story_content']
story['social_user_id'] = social_user_id
# story_date = localtime_for_timezone(story['story_date'], user.profile.timezone)
shared_date = localtime_for_timezone(story['shared_date'], user.profile.timezone)

@@ -252,6 +256,7 @@ def load_river_blurblog(request):
for story in starred_stories])
shared_stories = MSharedStory.objects(user_id=user.pk,
story_hash__in=story_hashes)\
.hint([('story_hash', 1)])\
.only('story_hash', 'shared_date', 'comments')
shared_stories = dict([(story.story_hash, dict(shared_date=story.shared_date,
comments=story.comments))

@@ -394,7 +399,8 @@ def load_social_page(request, user_id, username=None, **kwargs):
params = dict(user_id=social_user.pk)
if feed_id:
params['story_feed_id'] = feed_id

if params.has_key('story_db_id'):
params.pop('story_db_id')
mstories = MSharedStory.objects(**params).order_by('-shared_date')[offset:offset+limit+1]
stories = Feed.format_stories(mstories, include_permalinks=True)

@@ -433,7 +439,8 @@ def load_social_page(request, user_id, username=None, **kwargs):
for story in stories:
if user.pk in story['share_user_ids']:
story['shared_by_user'] = True
shared_story = MSharedStory.objects.get(user_id=user.pk,
shared_story = MSharedStory.objects.hint([('story_hash', 1)])\
.get(user_id=user.pk,
story_feed_id=story['story_feed_id'],
story_hash=story['story_hash'])
story['user_comments'] = shared_story.comments

@@ -449,7 +456,9 @@ def load_social_page(request, user_id, username=None, **kwargs):
social_services = MSocialServices.get_user(social_user.pk)

active_story_db = MSharedStory.objects.filter(user_id=social_user.pk,
story_guid_hash=story_id).limit(1)
story_hash=story_id)\
.hint([('story_hash', 1)])\
.limit(1)
if active_story_db:
active_story_db = active_story_db[0]
if user_social_profile.bb_permalink_direct:

@@ -551,6 +560,13 @@ def mark_story_as_shared(request):
'message': 'Could not find the original story and no copies could be found.'
})

feed = Feed.get_by_id(feed_id)
if feed and feed.is_newsletter:
return json.json_response(request, {
'code': -1,
'message': 'You cannot share newsletters. Somebody could unsubscribe you!'
})

if not request.user.profile.is_premium and MSharedStory.feed_quota(request.user.pk, feed_id, story.story_hash):
return json.json_response(request, {
'code': -1,

@@ -558,7 +574,9 @@ def mark_story_as_shared(request):
})
shared_story = MSharedStory.objects.filter(user_id=request.user.pk,
story_feed_id=feed_id,
story_hash=story['story_hash']).limit(1).first()
story_hash=story['story_hash'])\
.hint([('story_hash', 1)])\
.limit(1).first()
if not shared_story:
story_db = {
"story_guid": story.story_guid,

@@ -566,7 +584,7 @@ def mark_story_as_shared(request):
"story_permalink": story.story_permalink,
"story_title": story.story_title,
"story_feed_id": story.story_feed_id,
"story_content_z": story.story_content_z,
"story_content_z": getattr(story, 'story_latest_content_z', None) or story.story_content_z,
"story_author_name": story.story_author_name,
"story_tags": story.story_tags,
"story_date": story.story_date,

@@ -1007,7 +1025,7 @@ def load_follow_requests(request):
'request_profiles': request_profiles,
}

@ratelimit(minutes=1, requests=10)
@ratelimit(minutes=1, requests=100)
@json.json_view
def load_user_friends(request):
user = get_user(request.user)

@@ -1358,6 +1376,8 @@ def load_social_statistics(request, social_user_id, username=None):
# Stories per month - average and month-by-month breakout
stats['average_stories_per_month'] = social_profile.average_stories_per_month
stats['story_count_history'] = social_profile.story_count_history
stats['story_hours_history'] = social_profile.story_hours_history
stats['story_days_history'] = social_profile.story_days_history

# Subscribers
stats['subscriber_count'] = social_profile.follower_count

@@ -61,6 +61,7 @@ javascripts:
- media/js/vendor/tag-it.js
- media/js/vendor/chart.js
- media/js/vendor/audio.js
- media/js/vendor/push.js
- media/js/vendor/socket.io-client.*.js
- media/js/vendor/inflector.js
- media/js/vendor/underscore-*.js

@@ -69,6 +70,7 @@ javascripts:
- media/js/vendor/bootstrap-transition.js
- media/js/vendor/highlight.js
- media/js/vendor/fitvid.js
- media/js/vendor/imagesLoaded-*.js
- media/js/newsblur/reader/reader_utils.js
- media/js/newsblur/reader/reader.js
- media/js/newsblur/reader/reader_popover.js

13
clients/android/NewsBlur/.gitignore
vendored

@@ -1,12 +1,8 @@
build.xml
custom_rules.xml
local.properties
*.java.swo
*.java.swp
*.xml.swo
*.xml.swp
*.css.swo
*.css.swp
.*.swo
.*.swp
*.iml
out/
bin/

@@ -16,3 +12,8 @@ libs/ActionBarSherlock/
.settings/
.classpath
.project
.gradle/
app/
gradle/
*.gradle
!build.gradle

@@ -1,11 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.newsblur"
android:versionCode="113"
android:versionName="4.6.1" >
android:versionCode="131"
android:versionName="5.0.0b2" >

<uses-sdk
android:minSdkVersion="14"
android:minSdkVersion="16"
android:targetSdkVersion="23" />

<uses-permission android:name="android.permission.INTERNET" />

@@ -24,6 +24,7 @@
<activity
android:name=".activity.InitActivity"
android:label="@string/newsblur"
android:theme="@style/initStyle"
android:noHistory="true">
<intent-filter>
<category android:name="android.intent.category.LAUNCHER" />

@@ -140,12 +141,6 @@

<receiver android:name=".service.ServiceScheduleReceiver" />

<receiver android:name=".service.NetStateReceiver">
<intent-filter>
<action android:name="android.net.conn.CONNECTIVITY_CHANGE" />
</intent-filter>
</receiver>

</application>

</manifest>
74
clients/android/NewsBlur/BUILDING.md
Normal file
|
@ -0,0 +1,74 @@
|
|||
# Building the NewsBlur Android App
|
||||
|
||||
The NewsBlur Android application should build with virtually any supported Android build tool or environement. The file structure found in this repo has been chosen for maximum compatibility with various development setups. Several examples of how to build can be found below.
|
||||
|
||||
It is the goal of this repository to stay agnostic to build environments or tools. Please consider augmenting the .gitignore file to catch any developer-specific build artifacts or environment configuration you may discover while building.
|
||||
|
||||
## How to Build from the Command Line with Ant
|
||||
|
||||
*an abridged version of the official guide found [here](https://developer.android.com/tools/building/building-cmdline.html)*
|
||||
|
||||
*this type of build will use the vendored dependencies in `clients/android/NewsBlur/libs`*
|
||||
|
||||
1. install java and ant (prefer official JDK over OpenJDK)
|
||||
2. download the Android SDK from [android.com](https://developer.android.com/sdk/index.html)
|
||||
3. get the `tools/` and/or `platform-tools/` directories ifrom the SDK on your path
|
||||
4. `android update sdk --no-ui` (this could take a while; you can use the --filter option to just get the SDK, platform tools, and support libs)
|
||||
5. go to the clients/android/ NewsBlur directory and run `android update project --name NewsBlur --path .`
|
||||
6. build a test APK with `ant clean && ant debug` (.apk will be in `/bin` under the working directory)
|
||||
|
||||
## How to Build from the Command Line with Gradle
|
||||
|
||||
*this type of build will pull dependencies as prescribed in the gradle configuration*
|
||||
|
||||
1. install gradle v2.8 or better
|
||||
2. build a test APK with `gradle build` (.apk will be in `/build/outputs/apk/` under the working directory)
|
||||
|
||||
## How to Build from Android Studio
|
||||
|
||||
*this type of build will pull dependencies as prescribed in the gradle configuration*
|
||||
|
||||
1. install and fully update [Android Studio](http://developer.android.com/tools/studio/index.html)
|
||||
2. run AS and choose `import project`
|
||||
3. within your local copy of this repo, select the directory/path where this file is located
|
||||
4. select `OK` to let AS manage Gradle for your project
|
||||
6. select `Build -> Make Project from the menu`
|
||||
7. select `Build -> Build APK from the menu`
|
||||
|
||||
## Building Releases
|
||||
|
||||
*tip: a debug-compatible release key is usually located at `~/.android/debug.keystore` with the alias `androiddebugkey` and the passwords `android`.*
|
||||
|
||||
### Ant Builds
|
||||
|
||||
* Create a `local.properties` file with the following values:
|
||||
|
||||
```
|
||||
has.keystore=true
|
||||
key.store=<path to your keystore file>
|
||||
key.alias=<alias of the key with which you would like to sign the APK>
|
||||
```
|
||||
|
||||
* run `ant clean && ant release`
|
||||
|
||||
### Gradle Builds
|
||||
|
||||
* Add the following lines to the `android` section of the `build.gradle` file:
|
||||
|
||||
```
|
||||
signingConfigs {
|
||||
release {
|
||||
storeFile file('<absolute path to your keystore file>')
|
||||
keyAlias '<alias of the key with which you would like to sign the APK>'
|
||||
storePassword '<keystore password>'
|
||||
keyPassword '<key password>'
|
||||
}
|
||||
}
|
||||
buildTypes.release.signingConfig = signingConfigs.release
|
||||
```
|
||||
|
||||
* run `gradle assembleRelease`
|
||||
|
||||
### Android Studio Builds
|
||||
|
||||
* See the AS documentation on [signing release builds](http://developer.android.com/tools/publishing/app-signing.html#studio)

@ -1,10 +0,0 @@
## How To Build from the Command Line

*an abridged version of the official guide found [here](https://developer.android.com/tools/building/building-cmdline.html)*

1. install java and ant (prefer official JDK over OpenJDK)
2. download the Android SDK from [android.com](https://developer.android.com/sdk/index.html)
3. get the `tools/` and/or `platform-tools/` directories from the SDK on your path
4. `android update sdk --no-ui` (this could take a while; you can use the --filter option to just get the SDK, platform tools, and support libs)
5. go to the clients/android/NewsBlur directory and run `android update project --name NewsBlur --path .`
6. build a test APK with `ant clean && ant debug` (.apk will be in `/bin` under the working directory)
@ -1,22 +1,28 @@
body, span {
body, span, table, td, p, div, li {
    background-color: #1A1A1A !important;
    color: #FFF !important;
}

a, a span {
    color: #319DC5 !important;
    background-color: #1A1A1A !important;
}

a:visited, a:visited span {
    color: #319DC5 !important;
    background-color: #1A1A1A !important;
}

code {
pre, blockquote, code {
    background-color: #4C4C4C;
}

pre, blockquote {
    background-color: #4C4C4C;
blockquote *, pre *, code * {
    background-color: #4C4C4C !important;
}

blockquote a, blockquote a span {
    background-color: #4C4C4C !important;
}

.NB-story > table {
@ -21,13 +21,14 @@ p, div, table {
    line-height: 1.5em;
}

p, a, table, video, embed, object, iframe, div, figure, dl, dt {
p, a, table, video, embed, object, iframe, div, figure, dl, dt, center {
    /* virtually all story content wants to be the wrong size for our viewport, so try to fit it horizontally, as we
       can only scroll vertically. however, do not auto-set height for these elements, as they tend to ratio down to
       something tiny before dynamic content is done loading. these types will resize well, but exclude images, which
       distort. */
    width: auto !important;
    max-width: none !important;
    margin: 0px !important;
    min-width: 0px !important;
}

img {
50
clients/android/NewsBlur/build.gradle
Normal file

@ -0,0 +1,50 @@
buildscript {
    repositories {
        mavenCentral()
        jcenter()
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:2.2.2'
    }
}

repositories {
    mavenCentral()
}

apply plugin: 'com.android.application'

dependencies {
    compile 'com.android.support:support-v13:19.1.0'
    compile 'com.jakewharton:butterknife:7.0.1'
    compile 'com.squareup.okhttp3:okhttp:3.4.1'
    compile 'com.google.code.gson:gson:2.6.2'
}

android {
    compileSdkVersion 23
    buildToolsVersion '23.0.2'
    sourceSets {
        main {
            manifest.srcFile 'AndroidManifest.xml'
            java.srcDirs = ['src']
            res.srcDirs = ['res']
            assets.srcDirs = ['assets']
        }
    }
    lintOptions {
        abortOnError false
    }
    buildTypes {
        debug {
            minifyEnabled false
            shrinkResources false
        }
        release {
            minifyEnabled true
            shrinkResources true
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-project.txt'
        }
    }
}
1
clients/android/NewsBlur/libs/README.LIBS.md
Normal file

@ -0,0 +1 @@
*Note: the libraries provided here are vendored for ease of building with Ant, Eclipse, etc. If any dependency versions are changed here, they should also be changed in the build.gradle file.*

BIN
clients/android/NewsBlur/libs/butterknife-7.0.1.jar
Normal file
BIN
clients/android/NewsBlur/libs/gson-2.6.2.jar
Normal file
BIN
clients/android/NewsBlur/libs/okhttp-3.4.1.jar
Normal file
BIN
clients/android/NewsBlur/libs/okio-1.9.0.jar
Normal file
@ -4,4 +4,7 @@
    <issue id="RtlHardcoded" severity="ignore" />
    <issue id="RtlSymmetry" severity="ignore" />
    <issue id="IconLocation" severity="ignore" />
    <issue id="InvalidPackage">
        <ignore regexp="okio-.*.jar" />
    </issue>
</lint>
27
clients/android/NewsBlur/proguard-project.txt
Normal file

@ -0,0 +1,27 @@
-dontobfuscate

-keepattributes Exceptions,InnerClasses,Signature
-keepattributes *Annotation*

-dontwarn okio.**
-dontnote okio.**
-keep class okhttp3.** { *; }
-keep interface okhttp3.** { *; }
-dontwarn okhttp3.**
-dontnote okhttp3.**

-keep class butterknife.** { *; }
-dontwarn butterknife.internal.**
-keep class **$$ViewBinder { *; }
-keepclasseswithmembernames class * {
    @butterknife.* <fields>;
}
-keepclasseswithmembernames class * {
    @butterknife.* <methods>;
}

# we use proguard only as an APK shrinker and many of our dependencies throw
# all manner of gross warnings. kept silent by default, the following lines
# can be commented out to help diagnose shrinkage errors.
-dontwarn **
-dontnote **
@ -1,12 +1,2 @@
# This file is automatically generated by Android Tools.
# Do not modify this file -- YOUR CHANGES WILL BE ERASED!
#
# This file must be checked in Version Control Systems.
#
# To customize properties used by the Ant build system edit
# "ant.properties", and override values to adapt the script to your
# project structure.
#

# Project target.
target=android-23
proguard.config=${sdk.dir}/tools/proguard/proguard-android.txt:proguard-project.txt
@ -1,7 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<selector xmlns:android="http://schemas.android.com/apk/res/android">
    <item android:state_pressed="true" android:color="@color/darkgray" />
    <item android:state_enabled="false" android:color="@color/darkgray" />
    <item android:state_focused="true" android:color="@color/midgray" />
    <item android:color="@color/midgray" />
</selector>
(14 binary image assets removed; sizes ranged from 748 B to 3.7 KiB)
@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners android:radius="2dp" />
            <solid android:color="@color/col_button_background" />
        </shape>
    </item>

</layer-list>

@ -0,0 +1,11 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners android:radius="2dp" />
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray46"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_pressed_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/gray30" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_check_circle_gray74"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/gray46" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_check_circle_gray74"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_forward_gray74"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_pressed_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_forward_gray74"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_send_gray74"
        android:bottom="12dp"
        android:left="18dp"
        android:right="14dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_pressed_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_send_gray74"
        android:bottom="12dp"
        android:left="18dp"
        android:right="14dp"
        android:top="12dp"/>

</layer-list>
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_story_feed_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="82dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_pressed_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_story_feed_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="82dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_story_text_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="82dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_pressed_dark" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_story_text_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="82dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray74"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray46"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>
@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="2dp"
                android:bottomLeftRadius="20dp"
                android:topLeftRadius="20dp"
                android:topRightRadius="2dp"/>
            <solid android:color="@color/col_button_background_pressed" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_back_gray46"
        android:bottom="12dp"
        android:left="12dp"
        android:right="22dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/gray75" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_check_circle_gray46"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_pressed" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_check_circle_gray46"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_forward_gray46"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background_pressed" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_arrow_forward_gray46"
        android:bottom="12dp"
        android:left="92dp"
        android:right="12dp"
        android:top="12dp"/>

</layer-list>

@ -0,0 +1,22 @@
<?xml version="1.0" encoding="utf-8"?>
<layer-list xmlns:android="http://schemas.android.com/apk/res/android">

    <item>
        <shape android:shape="rectangle">
            <corners
                android:bottomRightRadius="20dp"
                android:bottomLeftRadius="2dp"
                android:topLeftRadius="2dp"
                android:topRightRadius="20dp"/>
            <solid android:color="@color/col_button_background" />
        </shape>
    </item>

    <item
        android:drawable="@drawable/ic_send_gray46"
        android:bottom="12dp"
        android:left="18dp"
        android:right="14dp"
        android:top="12dp"/>

</layer-list>