GSOC brainstorming.

GSOC brainstorming.

Hasanat Kazmi
Hello List,
Please let me introduce myself. I am Hasanat Kazmi. I started, and
failed to monetize, a Dropbox-like service well before Dropbox came on
the scene. I also worked on rsync for my undergrad final-year project
and wrote a research paper (though I didn't publish it after graduating).
I participated in GSoC 2009 with Sahana and applied for ownCloud in
GSoC 2010, but unfortunately wasn't selected.

I have been silently observing the ownCloud mailing list and wiki for
the last year, and I have noticed that the pace of development of
ownCloud is really slow.

Looking at the Technology Brainstorming page on the wiki, I think I
could be best utilized working on P2P sync between servers through NAT,
as I have prior experience with P2P syncing in Sahana. I was wondering
how interested you guys (especially Frank Karlitschek) are in this. I
also couldn't find any information on the wiki about progress (if any)
on this.

Moreover, what are some other areas where you guys would like to see
work during this year's SoC?

Hasanat Kazmi
+923464362473

Re: GSOC brainstorming.

Frank Karlitschek

On 26.01.2011, at 17:42, Hasanat Kazmi wrote:

> Hello List,
> Please let me introduce myself. I am Hasanat Kazmi. I started, and
> failed to monetize, a Dropbox-like service well before Dropbox came on
> the scene. I also worked on rsync for my undergrad final-year project
> and wrote a research paper (though I didn't publish it after graduating).
> I participated in GSoC 2009 with Sahana and applied for ownCloud in
> GSoC 2010, but unfortunately wasn't selected.
>
> I have been silently observing the ownCloud mailing list and wiki for
> the last year, and I have noticed that the pace of development of
> ownCloud is really slow.

I'm not sure I agree. If you look at the git commits, you see a lot of activity. :-)

>
> Looking at the Technology Brainstorming page on the wiki, I think I
> could be best utilized working on P2P sync between servers through NAT,
> as I have prior experience with P2P syncing in Sahana. I was wondering
> how interested you guys (especially Frank Karlitschek) are in this. I
> also couldn't find any information on the wiki about progress (if any)
> on this.

This sounds interesting, but I don't fully understand what you mean. It would be great if you could send a more detailed description of your suggestion to this list.

>
> Moreover, what are some other areas where you guys would like to see
> work during this year's SoC?

We haven't decided yet if, and what, we will do for GSoC this year.


Do you have suggestions?  :-)


> Hasanat Kazmi
> +923464362473


--
Frank Karlitschek
karlitschek at kde.org





Re: GSOC brainstorming.

Hasanat Kazmi
>>
>> Looking at the Technology Brainstorming page on the wiki, I think I
>> could be best utilized working on P2P sync between servers through NAT,
>> as I have prior experience with P2P syncing in Sahana. I was wondering
>> how interested you guys (especially Frank Karlitschek) are in this. I
>> also couldn't find any information on the wiki about progress (if any)
>> on this.
>
> This sounds interesting, but I don't fully understand what you mean. It would be great if you could send a more detailed description of your suggestion to this list.

I am suggesting that we implement NAT hole punching so that a server
can be accessed even if it doesn't have a public IP.
It is a common scenario that home servers sit behind a router and
cannot be reached from the outside world, and having a dedicated IP at
home is a costly business. I don't think there should be any doubt
about the importance of this feature.

How can this be implemented? We basically need:
1. a tracker server
2. client-side software (running on both the ownCloud server and the
client machines); a minimal sketch of the tracker side follows below
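
To make the tracker idea concrete, here is a minimal rendezvous sketch
(TypeScript on Node.js; the JSON message format, the names and the UDP
port are illustrative assumptions, not anything that exists in ownCloud
today). Both the home server and a client register with the tracker,
and the tracker tells each side the public address/port the other's NAT
exposed, so both can start sending packets and punch a hole:

import dgram from "node:dgram";

const tracker = dgram.createSocket("udp4");
// registered name -> public endpoint as seen by the tracker
const registered = new Map<string, { address: string; port: number }>();

tracker.on("message", (msg, rinfo) => {
  const { type, name } = JSON.parse(msg.toString()) as { type: string; name: string };
  if (type === "register") {
    // Remember the public address/port the sender's NAT assigned.
    registered.set(name, { address: rinfo.address, port: rinfo.port });
  } else if (type === "connect") {
    const peer = registered.get(name);
    if (!peer) return;
    // Exchange endpoints: each side now sends packets to the other,
    // which opens a mapping (the "hole") in its own NAT.
    tracker.send(JSON.stringify(peer), rinfo.port, rinfo.address);
    tracker.send(JSON.stringify({ address: rinfo.address, port: rinfo.port }),
                 peer.port, peer.address);
  }
});

tracker.bind(41234); // arbitrary port, just for illustration

The actual data transfer then happens directly between the two peers;
the tracker only brokers the introduction.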

But if everyone needs to download software to access his/her files,
the whole point of accessing them through the browser (the current
state) is lost.

So we can mold it and make it work like this: we host a service at our
end which basically mirrors your server, e.g. by accessing
http://my-home-server at owncloud-tracker.org, my home server opens in
the browser. Multiple servers can be served from a single tracker
server. The point to notice is that the tracker still has no idea about
my personal files, as it can't read them, so one can safely use someone
else's tracker server. I doubt ownCloud will have enough resources to
host trackers for others.
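
A rough sketch of what the tracker-side relay could look like, again in
TypeScript on Node.js (the per-server subdomain convention, the sample
endpoint and the ports are made-up assumptions; note that a plain HTTP
relay like this only forwards bytes, so keeping the content unreadable
to the tracker is a separate encryption question):

import http from "node:http";

// Filled in by the registration/hole-punching step; values are examples.
const endpoints = new Map<string, { host: string; port: number }>([
  ["my-home-server", { host: "203.0.113.7", port: 8080 }],
]);

http.createServer((req, res) => {
  // e.g. a browser request for http://my-home-server.owncloud-tracker.org/
  const name = (req.headers.host ?? "").split(".")[0];
  const target = endpoints.get(name);
  if (!target) {
    res.writeHead(404);
    res.end("unknown server");
    return;
  }
  // Relay the request to the home server and stream the answer back;
  // the tracker itself stores nothing.
  const upstream = http.request(
    { host: target.host, port: target.port, path: req.url,
      method: req.method, headers: req.headers },
    upstreamRes => {
      res.writeHead(upstreamRes.statusCode ?? 502, upstreamRes.headers);
      upstreamRes.pipe(res);
    });
  req.pipe(upstream);
}).listen(80);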

(If you guys let me do this in this year's SoC, I'll set up a tracker
server for you guys on Amazon for one year :P )

I am open to criticism and suggestions.


Re: GSOC brainstorming.

Bugs Bane
While this is a feature used more when people run ownCloud on a server
at home than when they upload to a web host, Hasanat is correct that it
would be very, very useful. Tonido (which could be considered a
competitor) works this way. Because of the NAT hole punching, they can
make it super easy to get running from home: just install a .deb / .rpm,
create a username / password, and you're ready to go in minutes. Compare
this to something like Diaspora, which doesn't (yet) use this kind of
technology and thus took me many, many hours of quite in-depth technical
testing to get working. Average users aren't going to know about port
forwarding on their router (most of which are different, making
instructions hard) or about securing the result.

> I am suggesting that we implement NAT hole punching so that a server
> can be accessed even if it doesn't have a public IP.
> It is a common scenario that home servers sit behind a router and
> cannot be reached from the outside world, and having a dedicated IP at
> home is a costly business. I don't think there should be any doubt
> about the importance of this feature.
>
> How can this be implemented? We basically need:
> 1. a tracker server
> 2. client-side software (running on both the ownCloud server and the client machines)
>
> But if everyone needs to download software to access his/her files,
> the whole point of accessing them through the browser (the current
> state) is lost.
>
> So we can mold it and make it work like this: we host a service at our
> end which basically mirrors your server, e.g. by accessing
> http://my-home-server at owncloud-tracker.org, my home server opens in
> the browser. Multiple servers can be served from a single tracker
> server. The point to notice is that the tracker still has no idea about
> my personal files, as it can't read them, so one can safely use someone
> else's tracker server. I doubt ownCloud will have enough resources to
> host trackers for others.
>
> (If you guys let me do this in this year's SoC, I'll set up a tracker
> server for you guys on Amazon for one year :P )
>
> I am open to criticism and suggestions.
>

Re: GSOC brainstorming.

Fabio Alessandro Locati
I would like to ask about the security of this system. How can I be sure
where my data are and who can see them?

2011/1/26 Bugsbane <bugsbane at gmail.com>

> While this is a feature used more when people run ownCloud on a server
> at home than when they upload to a web host, Hasanat is correct that it
> would be very, very useful. Tonido (which could be considered a
> competitor) works this way. Because of the NAT hole punching, they can
> make it super easy to get running from home: just install a .deb / .rpm,
> create a username / password, and you're ready to go in minutes. Compare
> this to something like Diaspora, which doesn't (yet) use this kind of
> technology and thus took me many, many hours of quite in-depth technical
> testing to get working. Average users aren't going to know about port
> forwarding on their router (most of which are different, making
> instructions hard) or about securing the result.

Re: GSOC brainstorming.

Hasanat Kazmi
On Thu, Jan 27, 2011 at 4:26 PM, Fabio Alessandro Locati
<flocati at grimp.eu> wrote:
> I would like to ask about the security of this system. How can I be sure
> where my data are and who can see them?
One way of securing the content is to zip it with a password on the fly.
That's easily doable using built-in PHP modules. Moreover, every OS
nowadays has built-in zipping and unzipping support.
But the contents of the folders would still be apparent to the tracker
server, even though it can't see what's inside the files. This, too, can
be fixed with client-side JavaScript encryption: you get the folder list
in encrypted form, then the user enters a password on the web page and
it is decrypted on the fly (using JS, without any communication with the
server). It won't be slow, because the encrypted content is very small.
The user can always glance at the source of the web page to make sure it
hasn't been tampered with by the tracker. Even if the tracker were
somehow saving my decryption password, it still couldn't look inside the
content.
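
For what it's worth, a minimal sketch of that browser-side decryption
using today's Web Crypto API (which postdates this thread); the function
name, the PBKDF2/AES-GCM choices and the JSON folder-listing format are
all assumptions:

// Decrypt an encrypted folder listing with a key derived from the
// user's password, entirely in the browser, without sending the
// password anywhere.
async function decryptListing(
  password: string,
  salt: Uint8Array,       // stored next to the ciphertext on the home server
  iv: Uint8Array,         // per-listing initialisation vector
  ciphertext: ArrayBuffer
): Promise<string[]> {
  const enc = new TextEncoder();
  const baseKey = await crypto.subtle.importKey(
    "raw", enc.encode(password), "PBKDF2", false, ["deriveKey"]);
  const key = await crypto.subtle.deriveKey(
    { name: "PBKDF2", salt, iterations: 100_000, hash: "SHA-256" },
    baseKey,
    { name: "AES-GCM", length: 256 },
    false,
    ["decrypt"]);
  const plaintext = await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext);
  // The listing is assumed to be a JSON array of file names.
  return JSON.parse(new TextDecoder().decode(plaintext));
}

The listing stays small, so decrypting it on the fly like this is cheap.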

Suggestions are welcome.


Hasanat Kazmi
+923464362473

Re: GSOC brainstorming.

Riccardo Iaconelli
On Thursday 27 January 2011 23:52:11 Hasanat Kazmi wrote:
> One way of securing the content is to zip it with a password on the fly.
> That's easily doable using built-in PHP modules. Moreover, every OS
> nowadays has built-in zipping and unzipping support.

Uhm... what about SSL?

> But the contents of the folders would still be apparent to the tracker
> server, even though it can't see what's inside the files. This, too, can
> be fixed with client-side JavaScript encryption: you get the folder list
> in encrypted form, then the user enters a password on the web page and
> it is decrypted on the fly (using JS, without any communication with the
> server). It won't be slow, because the encrypted content is very small.

What password would this be? JavaScript passwords would be quite easy to
intercept... no?

> The user can always glance at the source of the web page to make sure it
> hasn't been tampered with by the tracker. Even if the tracker were
> somehow saving my decryption password, it still couldn't look inside the
> content.

I like the idea, I'm just trying to make it bulletproof. :)

Bye,
-Riccardo