Commit

fixed conflicts
tiero committed Nov 30, 2017
2 parents b9f41ef + b0625ed commit daa731e
Showing 48 changed files with 27,004 additions and 128 deletions.
16 changes: 8 additions & 8 deletions README.md
@@ -111,9 +111,9 @@ ontology.


### NetworkABC ###
-The base class for block-chain netwoks. NetworkABC defines the protocol for
+The base class for block-chain networks. NetworkABC defines the protocol for
managing the interactions of Agents, Ontology, ServiceDescriptors, as well as
-Agent discovery, and negotion. Each block-chain implementation will require a
+Agent discovery, and negotiation. Each block-chain implementation will require a
separate NetworkABC subclass which implements the smart-contracts and communication
protocols required to implement the Network ABC API.
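
A minimal sketch of what a NetworkABC subclass might look like. The method
names below are illustrative assumptions - the real NetworkABC API is not
shown in this diff:

```
# Sketch only: hypothetical method names, not the actual NetworkABC API.
from abc import ABC, abstractmethod


class NetworkABC(ABC):
    @abstractmethod
    def advertise_service(self, service_descriptor):
        """Publish a ServiceDescriptor so other Agents can discover it."""

    @abstractmethod
    def find_service_providers(self, service):
        """Agent discovery: locate Agents offering an Ontology Service."""


class EthereumNetworkDemo(NetworkABC):
    def advertise_service(self, service_descriptor):
        pass  # would call this chain's smart contracts

    def find_service_providers(self, service):
        return []  # would query this chain's agent registry
```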

@@ -140,7 +140,7 @@ Additionally, ServiceAdapterABC subclasses may also implement:
* **`init`** - perform service one-time initialization
* **`start`** - connect with external network providers required to perform service
* **`stop`** - disconnect in preparation for taking the service offline
-* **`can_perform`** - override to implment service specific logic
+* **`can_perform`** - override to implement service specific logic
* **`all_required_agents_can_perform`** - check if dependent agents can perform
sub-services
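
As a hedged sketch (taking as given only the hook names above and the
constructor and `perform` signatures visible in the adapters later in this
diff; the provider client here is hypothetical):

```
# Sketch only: an illustrative subclass wiring up the optional hooks.
class EchoAdapter(ServiceAdapterABC):
    type_name = "EchoAdapter"

    def start(self):
        # Connect to a hypothetical external provider.
        self.client = object()

    def stop(self):
        # Disconnect before the service goes offline.
        self.client = None

    def can_perform(self) -> bool:
        # Service-specific readiness plus a check on dependent agents.
        return self.client is not None and self.all_required_agents_can_perform()
```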

@@ -158,14 +158,14 @@ a client with underlying workers.

### Prerequisites ###

-At this time, the only OS that this has been tested on is Ubunut 16.04 LTS. This
+At this time, the only OS that this has been tested on is Ubuntu 16.04 LTS. This
may change in the future but for now, you must start there. There are only a
-few system level requirement.
+few system level requirements.

Docker and Docker Compose are used heavily. You must have a recent version of
Docker installed.

-The current demo uses a 3 node setup, Alice, Bob and Charlie.
+The current demo uses a 3-node setup, Alice, Bob and Charlie.

The following command will create and run the Alice node.

@@ -189,7 +189,7 @@ In yet another separate terminal, you can run the Charlie agent.
### Installing ###

The install process can take a bit of time. If you run into any issue, please
-do not hesitate to file a but report. Be sure to include the last few lines of
+do not hesitate to file a bug report. Be sure to include the last few lines of
the console output to help us determine where it failed.

You will not need sudo for the install as long as the items in the prerequisites
@@ -215,7 +215,7 @@ Tests are handled by PyTest via Tox

Docs are not currently included in the source as they are changing rapidly. We
do suggest you create the docs and look them over. Once this settles, we will
-likely have a online reference to these.
+likely have an online reference to these.

```
./tools.sh docs
1 change: 1 addition & 0 deletions agent/adapters/__init__.py
@@ -0,0 +1 @@
# Adapters module initialization goes here...
122 changes: 102 additions & 20 deletions agent/adapters/aigents/__init__.py
@@ -1,42 +1,39 @@
#
# agent/adapters/aigents/__init__.py - adapter integrating different sub-services of Aigents web service,
-# such as RSS feeding, social graph discovery, summarizing pattern axtraction and entity attribution
+# such as RSS feeding, social graph discovery, summarizing text extraction by pattern and entity attribution
#
# Copyright (c) 2017 SingularityNET
#
# Distributed under the MIT software license, see LICENSE file.
#

+import urllib.parse
import requests
import logging
from typing import List

+from adapters.aigents.settings import AigentsSettings
from sn_agent.job.job_descriptor import JobDescriptor
-from sn_agent.service_adapter import ServiceAdapterABC
from sn_agent.ontology import Service
+from sn_agent.service_adapter import ServiceAdapterABC, ServiceManager

logger = logging.getLogger(__name__)


class AigentsAdapter(ServiceAdapterABC):
    type_name = "AigentsAdapter"

    def __init__(self, app, service: Service, required_services: List[Service]) -> None:
        super().__init__(app, service, required_services)

        # Initialize member variables here.
-        self.response_template = None
+        self.settings = AigentsSettings()

    def post_load_initialize(self, service_manager: ServiceManager):
        super().post_load_initialize(service_manager)

        # Do any agent initialization here.
-        # TODO
+        # TODO login to Aigents here to work in one session? But then need to RSS working even if session is active and handle expired sessions as well!
        pass

    def get_attached_job_data(self, job_item: dict) -> dict:

        # Make sure the input type is one we can handle...
@@ -52,6 +49,37 @@ def get_attached_job_data(self, job_item: dict) -> dict:

        return input_data

    def request(self, session, request):
        # Post a single Aigents query and fail loudly if the service does not answer.
        url = self.settings.AIGENTS_PATH + "?" + request
        logger.info(url)
        r = session.post(url)
        if r is None or r.status_code != 200:
            raise RuntimeError("Aigents - no response")
        logger.info(r.text)
        return r

    def validate(self, data, key):
        # Ensure a required input field is present and non-empty.
        if key not in data or len(data[key]) < 1:
            raise RuntimeError("Aigents - no input data " + key)
        return data[key]

    def create_session(self):
        # Log in to Aigents over a requests session: identify by email,
        # answer the secret question, then set the session language.
        session = requests.Session()
        # TODO login in one query, if/when possible
        url = self.settings.AIGENTS_PATH + "?my email " + self.settings.AIGENTS_LOGIN_EMAIL + "."
        logger.info(url)
        r = session.post(url)
        logger.info(r.text)
        url = self.settings.AIGENTS_PATH + "?" + urllib.parse.quote_plus("my " + self.settings.AIGENTS_SECRET_QUESTION + " " + self.settings.AIGENTS_SECRET_ANSWER + ".")
        logger.info(url)
        r = session.post(url)
        logger.info(r.text)
        # set language
        url = self.settings.AIGENTS_PATH + "?my language english."
        logger.info(url)
        r = session.post(url)
        logger.info(r.text)
        return session

    def perform(self, job: JobDescriptor):
        logger.debug("Performing Aigents job.")
@@ -61,31 +89,85 @@ def perform(self, job: JobDescriptor):
        for job_item in job:

            # Get the input data for this job.
+            #TODO actual parameters handling
            job_data = self.get_attached_job_data(job_item)
+            logger.info(job_data)
+            #job_params = job_data['params']['job_params']
+            #logger.info('Aigents input'+job_params)
-            rss_area = job_data['rss_area']

-            #TODO config
-            r = requests.get("https://aigents.com/al/?rss%20"+rss_area)
-            logger.info(r)
+            if not 'data' in job_data:
+                raise RuntimeError("Aigents - no input data")

-            if r is None:
+            r = self.aigents_perform(job_data['data'])
+            if r is None or r.status_code != 200:
                raise RuntimeError("Aigents - no response")

-            output = r.text

            # Add the job results to our combined results array for all job items.
            single_job_result = {
                'adapter_type' : 'aigents',
-                'service_type' : 'rss',
-                'response_data': output
+                'service_type' : job_data["type"], # TODO cleanup, based on service request and response ontology discussion?
+                'response_data': r.text
            }
            results.append(single_job_result)

        # Return the list of results that come from appending the results for the
        # individual job items in the job.
        return results

    # Placeholder or virtual method for child override
    def aigents_perform(self, data):
        return None


class AigentsTextsClustererAdapter(AigentsAdapter):
    type_name = "AigentsTextsClustererAdapter"

    def aigents_perform(self, data):
        # Cluster the given texts, asking Aigents for a JSON-formatted reply.
        texts = self.validate(data, "texts")
        s = self.create_session()
        r = self.request(s, "You cluster format json texts '" + texts + "'!")
        return r


class AigentsTextExtractorAdapter(AigentsAdapter):
    type_name = "AigentsTextExtractorAdapter"

    def aigents_perform(self, data):
        pattern = self.validate(data, "pattern")
        text = self.validate(data, "text")
        s = self.create_session()
        # TODO cleanup and streamline, make json in http header 'Accept': 'application/json'
        # specify format
        self.request(s, "peer has format.")
        self.request(s, "my format json.")
        # set user context
        self.request(s, "my knows '" + pattern + "', trusts '" + pattern + "'.")
        self.request(s, "my sites '" + pattern + "', trusts '" + text + "'.")
        self.request(s, "is '" + pattern + "' new false.")
        self.request(s, "no there is '" + pattern + "'.")
        # do extraction and request data
        self.request(s, "You reading '" + pattern + "' in '" + text + "'!")
        r = self.request(s, "what is '" + pattern + "' text, about, context?")
        # clear user context
        self.request(s, "my knows no '" + pattern + "', trusts no '" + pattern + "'.")
        self.request(s, "my sites no '" + pattern + "', trusts no '" + text + "'.")
        self.request(s, "my format not json.")
        return r


class AigentsRSSFeederAdapter(AigentsAdapter):
    type_name = "AigentsRSSFeederAdapter"

    def aigents_perform(self, data):
        area = self.validate(data, "area")
        # sessionless request
        r = requests.post(self.settings.AIGENTS_PATH + "?rss%20" + area)
        logger.info(r)
        return r


class AigentsSocialGrapherAdapter(AigentsAdapter):
    type_name = "AigentsSocialGrapherAdapter"

    def aigents_perform(self, data):
        network = self.validate(data, "network")
        userid = self.validate(data, "userid")
        days = self.validate(data, "period")
        s = self.create_session()
        # NOTE: self.request() already prepends AIGENTS_PATH + "?", so only the
        # query itself is passed here (the original passed a full URL, which
        # request() would have prefixed a second time).
        query = network + ' id ' + userid + ' report, period ' + days + ', format json, authorities, fans, similar to me'
        r = self.request(s, query)
        return r
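
Read together with `perform()` and the `validate()` calls above, each adapter
implies a particular job-item payload shape. A hedged sketch (all values,
including the `type` strings, are illustrative assumptions):

```
# Illustrative payload shapes inferred from perform() and validate() above.
clusterer_job_data = {"type": "clustering",
                      "data": {"texts": "first text\nsecond text"}}
extractor_job_data = {"type": "extraction",
                      "data": {"pattern": "seminar", "text": "http://example.com/page"}}
rss_job_data = {"type": "rss", "data": {"area": "business"}}
grapher_job_data = {"type": "social", "data": {"network": "steemit",
                                               "userid": "aigents", "period": "30"}}
```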

17 changes: 17 additions & 0 deletions agent/adapters/aigents/settings.py
@@ -0,0 +1,17 @@
import os
from pathlib import Path

from sn_agent import SettingsBase

DEMO_PARENT_DIR = Path(__file__).parent.parent.parent


class AigentsSettings(SettingsBase):
    def __init__(self, **custom_settings):
        self._ENV_PREFIX = 'SN_AIGENTS_'
        self.AIGENTS_PATH = 'https://aigents.com/al/'
        self.AIGENTS_LOGIN_EMAIL = '[email protected]'
        self.AIGENTS_SECRET_QUESTION = '2+2*2'
        self.AIGENTS_SECRET_ANSWER = 'six'

        super().__init__(**custom_settings)
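
A usage sketch, assuming `SettingsBase` applies keyword overrides to the
matching attributes and (given `_ENV_PREFIX`) also reads `SN_AIGENTS_*`
environment variables; the override value below is hypothetical:

```
# Sketch: override a default via a keyword argument (value is hypothetical).
settings = AigentsSettings(AIGENTS_LOGIN_EMAIL='[email protected]')
assert settings.AIGENTS_PATH == 'https://aigents.com/al/'
```
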
1 change: 1 addition & 0 deletions agent/adapters/opencog/__init__.py
@@ -0,0 +1 @@
# adapters/opencog module initialization goes here...
129 changes: 129 additions & 0 deletions agent/adapters/opencog/relex/__init__.py
@@ -0,0 +1,129 @@
#
# adapters/opencog/relex/__init__.py - an AI adapter that integrates the relex natural language parser...
#
# Copyright (c) 2017 SingularityNET
#
# Distributed under the MIT software license, see LICENSE file.
#

import logging
from typing import List
import socket
import json
import select
import asyncio

from sn_agent.job.job_descriptor import JobDescriptor
from sn_agent.ontology import Service
from sn_agent.service_adapter import ServiceAdapterABC, ServiceManager

logger = logging.getLogger(__name__)


class RelexAdapter(ServiceAdapterABC):
    type_name = "RelexAdapter"

    def __init__(self, app, service: Service, required_services: List[Service]) -> None:
        super().__init__(app, service, required_services)

        # Initialize member variables here.
        self.response_template = None

    def post_load_initialize(self, service_manager: ServiceManager):
        super().post_load_initialize(service_manager)

    def get_attached_job_data(self, job_item: dict) -> dict:

        # Make sure the input type is one we can handle...
        input_type = job_item['input_type']
        if input_type != 'attached':
            logger.error("BAD input dict %s", str(job_item))
            raise RuntimeError("AgentSimple - job item 'input_type' must be 'attached'.")

        # Pull the input data from the job item
        input_data = job_item['input_data']
        if input_data is None:
            raise RuntimeError("AgentSimple - job item 'input_data' must be defined.")

        return input_data

    def relex_parse_sentence(self, sentence: str) -> str:

        # Open a TCP socket
        relex_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        time_out_seconds = 10.0
        relex_socket.settimeout(time_out_seconds)
        received_message = "NOT RECEIVED"

        try:
            # Connect to server and send data - note that "relex" below is the way to get to the
            # server running in another Docker container. See: docker_compose.yml
            relex_socket.connect(("relex", 9000))

            # Construct the message for the relex server. NOTE: It expects a "text: " at the
            # beginning and a "\n" at the end.
            relex_sentence = "text: " + sentence + "\n"

            # Send the sentence to the relex server.
            relex_socket.sendall(relex_sentence.encode('utf-8'))

            # Read the first parts
            received_message = relex_socket.recv(1024)

            # Strip off the length from the message
            if b'\n' in received_message:
                length_string, received_message = received_message.split(b'\n', 1)
                remaining_bytes = int(length_string) - len(length_string)

                # Read the rest if we don't already have the full JSON reply.
                if remaining_bytes > 1024:
                    received_message = received_message + relex_socket.recv(remaining_bytes)

            # Decode this since the rest of the system expects unicode strings and not the
            # bytes returned from the socket.
            received_message = received_message.decode('utf-8')

        except socket.timeout:
            print("Socket timed out")

        finally:
            relex_socket.close()

        return received_message


    def perform(self, job: JobDescriptor):
        logger.debug("Performing Relex parse job.")

        # Process the items in the job. The job may include many different sentences.
        results = []
        for job_item in job:

            # Get the input data for this job.
            job_data = self.get_attached_job_data(job_item)

            # Check to make sure you have the data required.
            sentence = job_data.get('sentence')
            if sentence is None:
                raise RuntimeError("RelexAdapter - job item 'input_data' missing 'sentence'")

            # Send the sentence to the relex server for parsing.
            parsed_sentence = self.relex_parse_sentence(sentence)

            # Add the job results to our combined results array for all job items.
            single_job_result = {
                'relex_parse': parsed_sentence,
            }
            results.append(single_job_result)

        # Return the list of results that come from appending the results for the
        # individual job items in the job.
        return results
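
For reference, a sketch of the job-item shape `perform()` iterates over,
inferred from `get_attached_job_data()` above; the `JobDescriptor` machinery
that normally builds these items is outside this diff:

```
# Illustrative job item for RelexAdapter (the sentence value is made up).
job_item = {
    'input_type': 'attached',
    'input_data': {'sentence': 'Alice asked Bob to parse this sentence.'},
}
```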






1 change: 1 addition & 0 deletions agent/adapters/tensorflow/__init__.py
@@ -0,0 +1 @@
# adapters/tensorflow module initialization goes here...
